
8555 Data Modeling Jobs - Page 13

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Consulting/Principal Software Engineer at our organization, you will be part of a Technology team that uses contemporary technologies to develop products and systems supporting science and health. Our focus is on providing robust digital solutions to aid research into significant global issues, with an emphasis on efficient DevOps practices encompassing speed, agility, cost efficiency, and high quality.

In this role, you will provide assistance and input to management, lead large multifunctional development activities, solve complex technical problems, write intricate code for computer systems, and serve as a senior source of expertise. You may also provide sizing or budget recommendations to management.

Your key responsibilities will include acting as the primary technical point of contact for external technology resources, providing design input across a product, working directly with customers and end users, serving as a key person on coding and technical issues, and interfacing with other technical personnel to finalize requirements. You will also write and review parts of detailed specifications for the development of system components of moderate complexity, complete complex bug fixes, design and work with complex data models, and mentor lead software developers on development methodologies and optimization techniques.

To be successful in this role, you should have at least 10 years of software engineering experience; a Bachelor's degree in Engineering, Computer Science, or equivalent experience (advanced degree preferred); expertise in software development processes (e.g., Agile, Waterfall); and knowledge of data modeling, design, manipulation, optimization, and best practices. You should also possess familiarity with industry trends in technologies and development languages, experience in test-driven development and maintenance, proficiency in multiple data storage subsystems, strong budgeting and finance skills, and proficiency in the use and development of applicable desktop tool sets. Furthermore, you should have experience partnering with and leading internal and external technology resources to solve complex business needs, strong interpersonal skills, experience with resource models such as Managed Services and/or Staff Augmentation, knowledge of industry best practices in external resource development, solid knowledge of architectural principles, and proficiency with data manipulation languages, including optimization techniques. Experience with development languages and platforms such as Java/J2EE, JavaScript, JSP, C/C++, HTML, XML, SQL, Windows, UNIX, and .NET, as well as with using and developing applicable tool sets, is also required.

In addition to technical skills, you should have strong organizational, project planning, time management, and change management skills across multiple functional groups and departments; advanced problem-solving experience; advanced verbal and written communication and customer service skills; and the ability to present information concisely and effectively to various audiences.

We offer a healthy work/life balance, wellbeing initiatives, shared parental leave, study assistance, and other benefits to support your well-being and long-term goals. Our comprehensive benefits package includes health insurance, life insurance, flexible working arrangements, employee assistance programs, medical screening, family support benefits, long-service awards, and various paid time off options. Join us at our global organization, where we combine quality information and extensive data sets with analytics to support visionary science, research, health education, and interactive learning. Your work with us contributes to addressing the world's grand challenges and fostering a sustainable future, using innovative technologies to support science and healthcare for a better world.

Posted 4 days ago


5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be joining BlitzenX as a seasoned Guidewire Architect responsible for leading the design and architecture of enterprise-grade insurance applications built on the Guidewire InsuranceSuite platform. Your expertise across PolicyCenter, BillingCenter, and ClaimCenter will be crucial in driving transformation and innovation in P&C insurance solutions.

Your key responsibilities will include owning the end-to-end architecture for Guidewire implementations (PC/BC/CC) and designing scalable, secure, and high-performing solutions within Guidewire and integrated systems. You will provide architectural leadership on large-scale Guidewire projects, including greenfield implementations and cloud migrations. Additionally, you will lead technical teams across onshore/offshore models, mentor developers and tech leads, and collaborate with product owners, enterprise architects, and business stakeholders. As a Guidewire Architect at BlitzenX, you will ensure that solutions comply with architectural standards, best practices, and governance, and you will conduct code reviews and provide performance optimization guidance.

You should bring 10+ years of experience in P&C insurance technology, with at least 5 years on the Guidewire platform. Deep knowledge of Guidewire Configuration, Integration, and Rating, along with strong experience in Guidewire Cloud, Edge APIs, and Gosu programming, will be invaluable in this role. Proven success as a Solution or Technical Architect on Guidewire implementations, along with experience in DevOps, CI/CD pipelines, Docker, Kubernetes, web services, data modeling, and microservices, is essential. Excellent communication, leadership, and stakeholder management skills are a must; nice-to-have qualifications include Guidewire certification, experience with Agile/Scrum methodologies, and knowledge of AWS/Azure cloud platforms.

Posted 4 days ago


5.0 - 9.0 years

0 Lacs

Haryana

On-site

You are an experienced Data Modeller with specialized knowledge in designing and implementing data models for modern data platforms, specifically within the healthcare domain. Your expertise includes a deep understanding of data modeling techniques and healthcare data structures, along with experience with the Databricks Lakehouse architecture. Your role involves translating complex business requirements into efficient, scalable data models that support analytics and reporting needs.

You will be responsible for designing and implementing logical and physical data models for Databricks Lakehouse implementations. Collaboration with business stakeholders, data architects, and data engineers is crucial to create data models that facilitate the migration from legacy systems to the Databricks platform while ensuring data integrity, performance, and compliance with healthcare industry standards. Key responsibilities include creating and maintaining data dictionaries, entity relationship diagrams, and model documentation. You will develop dimensional models, data vault models, and other relevant modeling approaches. Supporting the migration of data models, ensuring alignment with the overall data architecture, and implementing data modeling best practices are also essential aspects of the role.

Your qualifications include extensive experience in data modeling for analytics and reporting systems and strong knowledge of dimensional modeling, data vault, and other methodologies. Proficiency with the Databricks platform, Delta Lake architecture, healthcare data modeling, and industry standards is required, as are experience migrating data models from legacy systems, strong SQL skills, and an understanding of data governance principles. Technical skills you must possess include expertise in data modeling methodologies, the Databricks platform, SQL, data definition languages, data warehousing concepts, ETL/ELT processes, performance tuning, metadata management, data cataloging, cloud platforms, and big data technologies. Your healthcare industry knowledge should encompass healthcare data structures, terminology, coding systems, data standards, analytics use cases, regulatory requirements, clinical and operational data modeling challenges, and population health and value-based care data needs.

Your educational background should include a Bachelor's degree in Computer Science, Information Systems, or a related field, with an advanced degree preferred. Professional certifications in data modeling or related areas would be advantageous for this role.
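The listing above asks for dimensional modeling on a lakehouse platform. As a rough, self-contained illustration of what a dimensional (star-schema) model looks like, the sketch below builds one hypothetical dimension and fact table and runs a typical analytic query. All table and column names are invented for the example, and sqlite3 stands in for a warehouse engine such as Databricks SQL.

```python
import sqlite3

# Minimal star-schema sketch for a hypothetical healthcare encounter mart:
# one dimension (patients) and one fact table (encounters) keyed to it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_patient (
    patient_key INTEGER PRIMARY KEY,   -- surrogate key
    patient_id  TEXT,                  -- natural/source key
    birth_year  INTEGER,
    gender      TEXT
);
CREATE TABLE fact_encounter (
    encounter_key  INTEGER PRIMARY KEY,
    patient_key    INTEGER REFERENCES dim_patient(patient_key),
    encounter_date TEXT,
    charge_amount  REAL
);
""")
conn.execute("INSERT INTO dim_patient VALUES (1, 'P001', 1980, 'F')")
conn.executemany(
    "INSERT INTO fact_encounter VALUES (?, ?, ?, ?)",
    [(10, 1, '2024-01-05', 120.0), (11, 1, '2024-02-10', 80.0)],
)

# Typical analytic query: aggregate fact measures by dimension attributes.
row = conn.execute("""
    SELECT p.patient_id, COUNT(*) AS visits, SUM(f.charge_amount) AS total
    FROM fact_encounter f JOIN dim_patient p USING (patient_key)
    GROUP BY p.patient_id
""").fetchone()
print(row)  # ('P001', 2, 200.0)
```

The surrogate/natural key split shown here is the standard dimensional-modeling convention: facts join on the warehouse-assigned surrogate key, while the source-system identifier is kept as an ordinary attribute.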

Posted 4 days ago


5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are an experienced Azure Databricks Engineer who will be responsible for designing, developing, and maintaining scalable data pipelines and supporting data infrastructure in an Azure cloud environment. Your key responsibilities will include designing ETL pipelines using Azure Databricks, building robust data architectures on Azure, collaborating with stakeholders to define data requirements, optimizing data pipelines for performance and reliability, implementing data transformation and cleansing processes, managing Databricks clusters, and leveraging Azure services for data orchestration and storage.

You must possess 5-10 years of experience in data engineering or a related field, with extensive hands-on experience in Azure Databricks and Apache Spark. Strong knowledge of Azure cloud services such as Azure Data Lake, Data Factory, Azure SQL, and Azure Synapse Analytics is required. Experience with Python, Scala, or SQL for data manipulation, ETL frameworks, Delta Lake, Parquet formats, Azure DevOps, CI/CD pipelines, big data architecture, and distributed systems is essential. Knowledge of data modeling, performance tuning, and optimization of big data solutions is expected, along with problem-solving skills and the ability to work in a collaborative environment.

Preferred qualifications include experience with real-time data streaming tools, Azure certifications, machine learning frameworks and their integration with Databricks, and data visualization tools such as Power BI. A bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field is required for this role.
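The pipelines described above typically land incremental changes into Delta Lake tables with a merge/upsert step (Delta's `MERGE INTO`). The following sketch reduces that pattern to plain Python dictionaries so it can run anywhere; the record shapes and the `id` match key are invented for the example and are not from the listing.

```python
# Sketch of the incremental merge/upsert pattern that Delta Lake's MERGE
# statement applies at scale: matched rows are updated, unmatched rows
# are inserted, and re-running the step is idempotent.
def merge_upsert(target, updates, key="id"):
    """Upsert `updates` into `target`, matching rows on `key`."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # matched  -> update in place
        else:
            by_key[row[key]] = dict(row)   # no match -> insert
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "city": "Hyderabad"}, {"id": 2, "city": "Pune"}]
updates = [{"id": 2, "city": "Mumbai"}, {"id": 3, "city": "Chennai"}]
result = merge_upsert(target, updates)
print(result)
# [{'id': 1, 'city': 'Hyderabad'}, {'id': 2, 'city': 'Mumbai'}, {'id': 3, 'city': 'Chennai'}]
```

Idempotence is the property that makes this pattern reliable in production pipelines: replaying the same update batch after a failure leaves the target table unchanged.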

Posted 4 days ago


1.0 - 5.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage, along with integration tools such as Jitterbit, to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle.

Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks.

To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as is hands-on experience with data ingestion, transformation, and loading tools such as Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial.

The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, as well as the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role.

At Synoptek, we value employees who embody our core DNA behaviors, including clarity, integrity, innovation, accountability, and a results-focused mindset. We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.

Posted 4 days ago


2.0 - 6.0 years

0 Lacs

Karnataka

On-site

Industry Cover is seeking a dedicated and skilled Data Analyst Manager to join our team in Bengaluru. As the Data Analyst Manager, you will play a crucial role in overseeing data analytics, statistics, data modeling, and communication processes. Your primary responsibility will involve analyzing data, developing data models, and effectively communicating insights to stakeholders. Your contributions will directly enhance our innovative products and services that provide exceptional health coverage to employees and their families.

The ideal candidate for this full-time on-site role will possess strong analytical skills, expertise in data analytics and statistics, proficiency in data modeling, and exceptional communication abilities. You should have a minimum of 2 years of experience in a data analytics role and hold a Bachelor's or Master's degree in Data Science, Statistics, Mathematics, or a related field. The role will require you to interpret complex data findings and convey them clearly and concisely to support effective decision-making and strategic planning within the organization.

Join us at Industry Cover and be part of a team that values transparency, trust, and employee well-being. Your role as a Data Analyst Manager will not only contribute to the success of our innovative Employee Benefits Program but also provide you with opportunities for professional growth and development. If you are passionate about data analytics, statistics, and communication, we invite you to apply and make a difference in the lives of employees through impactful health coverage solutions.

Posted 4 days ago


7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Senior Product Owner on a contract basis in a hybrid work environment in Chennai, your primary responsibility will be to serve as an agile product owner, ensuring that our software development projects align with customer needs and deliver maximum value while adhering to agile Scrum methodologies. You will drive analysis, validation, and verification to determine the data necessary to support business needs, including its creation, reading, updating, and deletion, along with the associated quality criteria.

Your role will involve leading the coordination of efforts with staff, vendors, and customers to understand business requirements and design data architecture, solutions, and processes. You will also support the definition of roadmaps and portfolios of change that reflect business strategy and performance objectives. Additionally, you will lead the development of processes, including conceptual, logical, and physical models, and deliver customized reports and recommendations to support ongoing business decisions and customer reporting requirements.

To excel in this role, you should have a Bachelor's degree in a relevant discipline, along with a minimum of 7 years of work experience as a business analyst or project manager. Scrum Product Owner certification is preferred. You must demonstrate sound judgment, attention to detail, accuracy, and follow-through on actions, while adapting flexibly to an ever-changing work environment. Effective communication of complex ideas in a clear and concise manner, both verbally and in writing, across functional and technical departments is essential. The ability to work on multiple tasks simultaneously, handle conflicting demands, prioritize workloads, and delegate tasks effectively while maintaining high-quality standards is crucial for success in this role. Expertise in quickly grasping the functions and capabilities of new technologies, as well as strong stakeholder management skills to facilitate change delivery in a busy working environment with competing priorities, are also key requirements. Your high emotional intelligence and solid interpersonal and relationship-building skills will be instrumental in establishing strong relationships with teams across the business.

Posted 4 days ago


4.0 - 11.0 years

0 Lacs

Haryana

On-site

You will be responsible for developing visual reports, dashboards, and KPI scorecards using Power BI Desktop. Connecting Power BI to various data sources, importing data, and transforming it per project requirements will be a key part of your role. You should be able to create data flows, including complex ones and ETL flows, over Power BI. Writing SQL queries and a strong understanding of database fundamentals, such as multidimensional and relational database design, are essential. Proficiency in creating complex DAX queries, Power Query, bookmarks, and SQL is required. Implementing row-level security on data and understanding application security layer models in Power BI are crucial aspects of the job. Additionally, developing tabular/multidimensional models that align with warehouse standards using M queries is part of your responsibilities.

As a suitable candidate, you should hold a Bachelor's or Master's degree in computer science/engineering, operations research, or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also encouraged to apply. Having 4-11 years of relevant experience in analytics is preferred. Expertise in model development for reports, data transformations, and modeling using the Power BI Query Editor is necessary. You will be required to develop and integrate with various data sources, using varied data connectors to in-house or cloud data stores to stage and shape data for reporting and analytics solutions. Creating relationships between data and developing tabular and other multidimensional data models will be part of your daily tasks. Handling complex data queries and client requirements, with hands-on experience on tools and systems in the MS SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, Power Pivot, and DAX, is crucial for successfully designing, modeling, and implementing end-to-end Power BI solutions.

Experience with data gateways for data refresh is expected. Investigating and troubleshooting Power BI reports and data models, including resolving data issues and performing data validation and balancing, will be part of your responsibilities. Knowledge of the insurance industry is preferred. Excellent written and verbal communication skills are essential for this role. A notice period of 15-30 days is preferable.
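The row-level security requirement above reduces to one idea: a per-user filter predicate is applied before data reaches the report layer (in Power BI this is a DAX filter attached to a security role). The sketch below shows the same idea in plain Python; the users, role-mapping table, and sales rows are all invented for illustration.

```python
# Row-level security sketch: each user sees only the rows their role allows.
SALES = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "North", "amount": 75},
]
# Role-mapping table (hypothetical): user -> region they may see.
USER_REGION = {"asha": "North", "ravi": "South"}

def visible_rows(user, rows=SALES):
    """Apply the per-user security filter before any report logic runs."""
    region = USER_REGION[user]
    return [r for r in rows if r["region"] == region]

print(sum(r["amount"] for r in visible_rows("asha")))  # 175
```

The key design point, in Power BI as here, is that the filter lives in the data layer rather than in each report, so every visual built on the model inherits it automatically.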

Posted 4 days ago


8.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Mongo Database Administrator at our company, you will play a crucial role in designing, implementing, and maintaining MongoDB databases. With 8-14 years of experience under your belt, you will be responsible for ensuring the performance, availability, and security of our MongoDB instances. Your expertise in database design, data modeling, performance tuning, and optimization will be key to your success in this role.

Key Responsibilities:
- Design and implement MongoDB database solutions to support business applications.
- Ensure high levels of performance, availability, sustainability, and security.
- Analyze and resolve database performance issues.
- Develop and maintain database standards and documentation.
- Collaborate with development teams to design and optimize database queries.
- Perform database tuning and maintenance activities.
- Implement backup and recovery strategies.
- Monitor database performance and provide recommendations for improvements.
- Stay updated with the latest MongoDB features and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a MongoDB Database Architect or similar role.
- In-depth knowledge of MongoDB architecture and design.
- Experience with database performance tuning and optimization.
- Strong understanding of data modeling and schema design.
- Proficiency in SQL and NoSQL databases.
- Familiarity with cloud-based database solutions (e.g., AWS, Azure).
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- MongoDB certification.
- Experience with other NoSQL databases (e.g., Cassandra, Couchbase).
- Knowledge of scripting languages (e.g., Python, Bash).
- Experience with DevOps practices and tools.
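The schema-design skill named above is, in MongoDB, largely a choice between embedding related data inside one document and referencing it across collections. The document shapes below are invented examples shown as plain Python dicts rather than live pymongo calls, so the trade-off can be seen without a running database.

```python
# Embedded design: the order carries its customer and line items inline,
# so a single read returns everything needed to render the order.
order_embedded = {
    "_id": 1001,
    "customer": {"name": "Meera", "city": "Noida"},   # embedded subdocument
    "items": [                                        # embedded array
        {"sku": "A1", "qty": 2, "price": 150.0},
        {"sku": "B7", "qty": 1, "price": 90.0},
    ],
}

# Referenced alternative: the order stores only the customer's _id, resolved
# against a separate customers collection -- preferable when the referenced
# data is large, frequently updated, or shared across many documents.
customer_doc = {"_id": 501, "name": "Meera", "city": "Noida"}
order_referenced = {"_id": 1001, "customer_id": 501, "items": []}

# One payoff of embedding: aggregates come from the one document just read.
total = sum(i["qty"] * i["price"] for i in order_embedded["items"])
print(total)  # 390.0
```

A DBA weighing the two shapes typically asks how the data is read together, how often the embedded part changes, and whether the embedded array can grow without bound.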

Posted 4 days ago


3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

The Data Visualization, Business Analytics role is vital for the organization: it involves transforming intricate data into visual insights for key stakeholders, facilitating informed decision-making and strategic planning. You will collaborate with business leaders to identify and prioritize data visualization requirements. Your responsibilities will include designing interactive dashboards and reports that illustrate essential business metrics and trends, and creating visually appealing charts, graphs, and presentations that are easily understandable. It is also essential to develop and uphold data visualization best practices and standards.

As part of the role, you will use various data visualization tools and platforms to present insights effectively. Conducting data analysis to identify patterns and trends for visualization will be a key task, as will applying user interface (UI) and user experience (UX) principles to enhance visualizations. Providing training and support to team members on data visualization techniques is also part of the responsibilities. Additionally, you will perform ad-hoc analysis and data mining to support business needs and collaborate with data engineers and data scientists to ensure data accuracy and integrity. It is important to stay current with industry trends and best practices in data visualization and business analytics. Presenting findings and insights to key stakeholders in a clear and compelling manner will be a regular task, as will communicating with cross-functional teams to understand data requirements and contributing to the continuous improvement of data visualization processes and techniques.

The role requires a Bachelor's degree in Data Science, Business Analytics, Computer Science, or a related field, and proven experience in data visualization, business intelligence, or related roles. Proficiency in data visualization tools such as Tableau, Power BI, or D3.js is essential, along with strong analytical and problem-solving skills and expertise in SQL for data querying and manipulation. An understanding of statistical concepts and data modeling is crucial, and excellent communication and presentation skills are necessary. You should be able to work effectively in a fast-paced, dynamic environment and bring knowledge of business operations and strategic planning. Experience interpreting and analyzing complex datasets is beneficial, and familiarity with data warehousing and ETL processes is a plus. Managing multiple projects and deadlines simultaneously, being detail-oriented with a focus on data accuracy and quality, working collaboratively in a team setting, and possessing strong business acumen and an understanding of key performance indicators are important for this role.

Posted 4 days ago


0.0 - 4.0 years

0 Lacs

Hyderabad, Telangana

On-site

A career within Financial Markets Business Advisory services will provide you with the opportunity to contribute to a variety of audit, regulatory, valuation, and financial analysis services, designing solutions that address clients' complex accounting and financial reporting challenges, as well as their broader business issues.

To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. The PwC Professional, our global leadership development framework, gives us a single set of expectations across our lines, geographies, and career paths. It provides transparency on the skills required as individuals to be successful and progress in our careers, now and in the future.

Responsibilities

As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Invite and give in-the-moment feedback in a constructive manner.
- Share and collaborate effectively with others.
- Identify and make suggestions for improvements when problems and/or opportunities arise.
- Handle, manipulate, and analyze data and information responsibly.
- Follow risk management and compliance procedures.
- Keep up to date with developments in the area of specialism.
- Communicate confidently in a clear, concise, and articulate manner, verbally and in the materials produced.
- Build and maintain an internal and external network.
- Seek opportunities to learn about how PwC works as a global network of firms.
- Uphold the firm's code of ethics and business conduct.

We are seeking a highly motivated Data Engineer - Associate to join our dynamic team. The ideal candidate will have a strong foundation in data engineering, particularly with Python and SQL, and exposure to cloud technologies and data visualization tools such as Power BI, Tableau, or QuickSight. The Data Engineer will work closely with data architects and business stakeholders to support the design and implementation of data pipelines and analytics solutions. This role offers an opportunity to grow technical expertise in cloud and data solutions, contributing to projects that drive business insights and innovation.

Key Responsibilities

Data Engineering:
- Develop, optimize, and maintain data pipelines and workflows to ensure efficient data integration from multiple sources.
- Use Python and SQL to design and implement scalable data processing solutions.
- Ensure data quality and consistency throughout data transformation and storage processes.
- Collaborate with data architects and senior engineers to build data solutions that meet business and technical requirements.

Cloud Technologies:
- Work with cloud platforms (e.g., AWS, Azure, or Google Cloud) to deploy and maintain data solutions.
- Support the migration of on-premise data infrastructure to the cloud environment when needed.
- Assist in implementing cloud-based data storage solutions, such as data lakes and data warehouses.

Data Visualization:
- Provide data to business stakeholders for visualizations using tools such as Power BI, Tableau, or QuickSight.
- Collaborate with analysts to understand their data needs and optimize data structures for reporting.

Collaboration and Support:
- Work closely with cross-functional teams, including data scientists and business analysts, to support data-driven decision-making.
- Troubleshoot and resolve issues in the data pipeline and ensure timely data delivery.
- Document processes, data flows, and infrastructure for team knowledge sharing.

Required Skills and Experience:
- 0+ years of experience in data engineering, working with Python and SQL.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud is preferred.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, QuickSight) is a plus.
- Basic understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong analytical and problem-solving skills, with attention to detail.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, Information Technology, or related fields.
- Basic knowledge of cloud platforms and services is advantageous.
- Strong communication skills and the ability to work in a team-oriented environment.
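The Python-and-SQL pipeline work this associate role describes can be sketched end to end in a few lines: extract raw records, transform them (cleansing and de-duplication), and load them into a SQL store for querying. The source records and table name below are invented, and sqlite3 stands in for a cloud warehouse.

```python
import sqlite3

# Minimal extract-transform-load sketch with illustrative data.
raw = [
    {"id": 1, "amount": " 120.5 "},
    {"id": 2, "amount": "80"},
    {"id": 2, "amount": "80"},       # duplicate, to be dropped
    {"id": 3, "amount": None},       # bad record, to be rejected
]

def transform(rows):
    """Cleanse (strip, cast, reject nulls) and de-duplicate on id."""
    seen, clean = set(), []
    for r in rows:
        if r["amount"] is None or r["id"] in seen:
            continue
        seen.add(r["id"])
        clean.append((r["id"], float(r["amount"].strip())))
    return clean

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", transform(raw))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 200.5
```

Keeping the transform as a pure function, as here, is a common design choice: it makes the cleansing rules unit-testable independently of any database or cloud service.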

Posted 4 days ago


5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Coupa, known for making companies operate smarter and grow faster through its leading AI-driven platform, is seeking an experienced BI Lead to join its team in Pune, India. As a BI Lead, you will play a crucial role in bridging the gap between business requirements and data analytics, fulfilling analytics requests using Tableau. The ideal candidate will be advanced in Tableau, possess strong skills in building complex dashboards, and excel in administration tasks. You will also be expected to build strong relationships with business stakeholders and collaborate effectively with data modelers and data architects.

At Coupa, the core values center on customer success, focusing on results, and striving for excellence. The company is committed to ensuring customer success through innovation, delivering results with a bias for action, and maintaining a collaborative environment infused with professionalism, integrity, passion, and accountability.

Please note that Coupa does not accept inquiries or resumes from recruiters. By submitting your application, you acknowledge that Coupa collects and processes your personal data in accordance with its Privacy Policy for managing recruitment activities. If your application is successful, your personal data will be used for employment purposes; if not, you may be notified of future job opportunities. More details about data processing and retention can be found in Coupa's Privacy Policy.

Posted 4 days ago


5.0 - 10.0 years

15 - 30 Lacs

Gurugram

Remote

Data Visualization and BI Manager

Join a leading independent service provider specializing in critical telecommunication and renewable energy infrastructure. With a global presence, our company delivers comprehensive engineering, maintenance, repair, and repowering solutions to ensure seamless operations in wireless and wireline telecom, commercial and utility-scale solar and wind projects, EV charging stations, and large-scale power generation and energy storage assets.

Your Impact: As the Data Visualization and BI Manager, you will lead a talented team of Power BI developers, leveraging Power BI and Salesforce for advanced reporting and analytics. Your strategic expertise will drive data-driven initiatives to enhance business decisions and performance monitoring.

Core Responsibilities:
- Lead and mentor a globally distributed team of data visualization and dashboard specialists.
- Oversee the creation and maintenance of comprehensive reports and dashboards using Power BI and Salesforce.
- Collaborate with data engineers and technical teams to enhance data pipelines, process automation, and data quality.
- Ensure data accuracy and integrity across all BI tools and systems.
- Promote the adoption of data analytics best practices and continuously improve BI processes.
- Present insights and recommendations clearly to stakeholders and executive leadership.
- Stay updated with industry trends and advancements in data analysis and BI technologies.

Core Qualifications:
- Minimum of 5 years of experience in an English-based work environment.
- Experience in professional services or telecommunications is preferred.
- Proven expertise in a BI management role focusing on data visualization and reporting best practices.
- Strong proficiency in Power BI, including DAX and data modeling, and familiarity with Salesforce reporting.
- Knowledge of SQL and database management; familiarity with Snowflake is a plus.
- Excellent leadership and team management skills.
- Strong analytical and problem-solving abilities.
- Effective communication and presentation skills.
- Bachelor's degree in Business Intelligence, Data Science, Computer Science, or a related field.

Benefits & Work Environment:
- Competitive salary and benefits package tailored for employees in India.
- Flexible work environment with opportunities for professional growth.
- Supportive, inclusive company culture that promotes collaboration and innovation.
- Access to necessary tools, a company-provided laptop, and software resources.

Posted 4 days ago

Apply

10.0 - 14.0 years

25 - 40 Lacs

Hyderabad

Work from Office

About Protiviti India Protiviti India is a global business consulting firm dedicated to helping leaders confidently face the future. Operating in over 25 countries with more than 90 offices worldwide, we are supported by over 11,000 professionals globally and achieved a global revenue of $2.10 billion as of January 2024. As a wholly-owned subsidiary of Robert Half (NYSE: RHI), we deliver deep expertise across various solutions including Internal Audit & Financial Advisory, Technology & Digital, Financial Services - Risk, Business Performance Improvement, and Managed Business Services. With a strong and growing presence across major Indian cities, Protiviti India prides itself on being a genuinely independent firm with proven methodologies and experienced professionals. Our vision is to bring confidence in a dynamic world, guided by values like integrity, innovation, and collaboration. We also maintain a strong partnership with the Confederation of Indian Industry (CII). We are currently expanding our Enterprise Application Services (EAS) within the Technology & Digital solution area, where we are proud to be a diamond-level partner for SAP Labs and globally ranked as the 11th preferred partner for SAP, with an ambitious goal to reach the top 5. About the Role: SAP Datasphere Architect We are seeking a highly skilled and experienced SAP Datasphere Architect to join our team. This pivotal role is for a seasoned professional with 10-12 years of experience in data architecture, data warehousing, and analytics, including at least 3 years of hands-on experience in SAP Datasphere (formerly SAP Data Warehouse Cloud). The ideal candidate will be instrumental in designing, implementing, and optimizing enterprise-level data integration and analytics solutions leveraging SAP Datasphere, alongside modern cloud and hybrid data architectures. 
Key Responsibilities: Design and architect scalable, high-performing, and secure data integration and analytics solutions using SAP Datasphere and associated SAP technologies. Provide technical leadership in integrating SAP Datasphere with critical enterprise systems like SAP S/4HANA, SAP BW/4HANA, SAP Analytics Cloud (SAC), and various non-SAP systems. Define and implement robust data modeling strategies, leveraging SAP Datasphere capabilities such as spaces, semantic layers, and virtual access. Lead the end-to-end implementation of SAP Datasphere projects, from requirement gathering and solution design through development, testing, and deployment. Optimize data flows and ensure efficient data processing for large-scale datasets in both real-time and batch environments. Collaborate closely with business stakeholders to understand requirements and translate them into effective technical solutions, while providing mentorship to junior team members. Required Qualifications & Skills: 10-12 years of experience in data architecture, data warehousing, and analytics, with a minimum of 3 years of hands-on experience in SAP Datasphere. Proficiency in SAP Datasphere, including expertise in spaces, data modeling, and integration features. Strong understanding of the broader SAP ecosystem, including S/4HANA, BW/4HANA, HANA Cloud, SAC, and BTP. Experience with cloud platforms such as Azure, AWS, or Google Cloud, particularly in hybrid data landscapes. In-depth knowledge of data integration technologies, including ETL/ELT tools, APIs, and SAP Data Intelligence. Strong SQL and data modeling skills, with experience in star and snowflake schemas. Proven ability to lead technical teams and manage multiple priorities effectively. Excellent problem-solving, analytical, and stakeholder management abilities. Soft Skills: Excellent problem-solving and analytical skills, with a keen eye for detail. Strong communication and stakeholder management abilities, ensuring effective collaboration. Proven ability to lead technical teams and manage multiple priorities in a dynamic environment. Commitment to continuous learning and staying current with emerging data technologies.
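The star and snowflake schemas this role calls for can be illustrated with a minimal, vendor-neutral sketch: a fact table of measures joined to dimension tables of attributes. Table and column names below are invented, and this is plain SQLite, not SAP Datasphere.

```python
import sqlite3

# Illustrative star schema: one fact table referencing two dimension tables.
# All names are hypothetical, not taken from any real SAP Datasphere model.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    revenue    REAL
);
INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
INSERT INTO dim_date VALUES (10, 2023), (11, 2024);
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 300.0);
""")

# Analytical queries aggregate the fact table and slice by dimension attributes.
rows = con.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id = f.date_id
    GROUP BY p.category, d.year
    ORDER BY p.category, d.year
""").fetchall()
print(rows)  # [('Hardware', 2023, 100.0), ('Hardware', 2024, 150.0), ('Software', 2024, 300.0)]
```

A snowflake schema differs only in that dimensions are further normalized (e.g., a category table referenced by dim_product).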

Posted 4 days ago

Apply

4.0 - 9.0 years

14 - 19 Lacs

Chennai

Work from Office

Embark on a transformative journey as a Data Architect Vice President based out of Chennai. You will design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and creating solutions that balance technology risk against business delivery while driving consistency. The data architect defines the appropriate data architecture for the data products, implementing the agreed data product principles and standards while engaging with the BUK CDA and CDO teams. The role translates business and use-case requirements into the logical and physical data models that data engineers use to build data products: capturing requirements from business teams, translating them into data models with performance implications in mind, testing the models with data engineers, and continuously monitoring and optimizing their performance. The team engages with the CDA team to design data product solutions covering data architecture, platform design, and integration patterns, ensuring they are in line with policy and technical data standards. It defines the data pipeline and design patterns that orchestrate data from source to target across a variety of architectures, designs and owns the enterprise solution architecture roadmap for data products, and collaborates with the technical product lead on data governance requirements, including ownership of data assets and data quality, lineage, and standards. You will partner with business stakeholders to understand their data needs and the desired functionality of the data product, and translate these requirements into clear data modelling specifications. You will design and implement high-quality data models that optimize data access, storage,
and analysis; create comprehensive, well-maintained documentation of the data models, including entity-relationship diagrams, data dictionaries, and usage guidelines; collaborate with data engineers to test and validate the data models; and obtain sign-off from the DPL, DPA, and the Technical Product Lead on the logical and physical data models. You will also continuously monitor and optimize model performance to ensure efficient data retrieval and processing, collaborating with data engineers to translate logical models into physical ones throughout the development lifecycle. To be successful as a Senior Engineering and Design Lead, you should have experience with: cloud platform expertise (AWS), including proficiency with AWS data services such as S3, Redshift, Athena, Glue, and Lambda; cloud architecture design, including designing cloud-native data architectures and optimizing cost and performance on AWS; data modelling and architecture; big data technologies such as Hadoop; data warehousing and analytics platforms such as Teradata and Snowflake; SQL and scripting; and data governance and quality. You should be able to engage with business stakeholders, technology teams, and data engineers to define requirements, align data strategies, and deliver high-value solutions, with proven experience leading cross-functional teams to execute complex data architectures. Other highly valued skills include familiarity with additional cloud services and exposure to multi-cloud or hybrid cloud architectures, data orchestration and automation, performance tuning and optimization, and data visualization. You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Chennai. Purpose of the role: to design, develop, and implement solutions to complex business problems,
collaborating with stakeholders to understand their needs and requirements, and to create solutions that balance technology risks against business delivery, driving consistency. Accountabilities: design and development of solutions as products that can evolve, meeting business requirements aligned with modern software engineering practices and automated delivery tooling, including identification and implementation of the appropriate technologies and platforms; targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities such as elasticity, serverless, and containerisation; best-practice designs incorporating security principles (such as defence in depth and reduction of blast radius) that meet the Bank's resiliency expectations; solutions that appropriately balance risks and controls to deliver the agreed business and technology value; adoption of standardised solutions where they fit, and feeding into their ongoing evolution where they do not; fault-finding and performance-issue support to operational support teams, leveraging available tooling; solution design impact assessment in terms of risk, capacity, and cost, including estimation of project change and ongoing run costs; and development of the architecture inputs required to comply with the bank's governance processes, including design artefacts for architecture, privacy, security, and records management governance. Vice President expectations: contribute to or set strategy, drive requirements, and make recommendations for change; plan resources, budgets, and policies; manage and maintain policies and processes; deliver continuous improvements and escalate breaches of policies and procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to
employee pay decisions and changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: Listen and be authentic, Energise and inspire, Align across the enterprise, and Develop others. An individual contributor will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify when other areas of specialisation are needed to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks, and strategic decisions. They will advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment; manage and mitigate risks through assessment, in support of the control and governance agenda; demonstrate leadership and accountability for managing risk and strengthening controls in relation to the team's work; demonstrate comprehensive understanding of how the organisation's functions contribute to achieving the goals of the business; collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and strategy; and create solutions based on sophisticated analytical thought, comparing and selecting among complex alternatives, with in-depth, interpretative analysis required to define problems and develop innovative
solutions. They will adopt and include the outcomes of extensive research in problem-solving, and will seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.

Posted 5 days ago

Apply

5.0 - 8.0 years

15 - 20 Lacs

Pune

Work from Office

Calling all innovators: find your future at Fiserv. We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day, quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title: Tech Lead, Data Architecture. What does a successful Snowflake Advisor do? We are seeking a highly skilled and experienced Snowflake Advisor to take ownership of our data warehousing strategy, implementation, maintenance, and support. In this role, you will design, develop, and lead the adoption of Snowflake-based solutions to ensure scalable, efficient, and secure data systems that empower our business analytics and decision-making processes. As a Snowflake Advisor, you will collaborate with cross-functional teams, lead data initiatives, and act as the subject matter expert for Snowflake across the organization. What You Will Do: Define and implement best practices for data modelling, schema design, and query optimization in Snowflake. Develop and manage ETL/ELT workflows to ingest, transform, and load data into Snowflake from various sources, integrating data from diverse systems such as databases, APIs, flat files, and cloud storage, and using tools like StreamSets, Informatica, or dbt to streamline data transformation processes. Monitor and tune Snowflake performance, including warehouse sizing, query optimization, and storage management; manage Snowflake caching, clustering, and partitioning to improve efficiency; analyze and resolve query performance bottlenecks; and monitor and resolve data quality issues within the warehouse. Collaborate with data analysts, data engineers, and business users to understand reporting and analytics needs, and work closely with the DevOps
team on automation, deployment, and monitoring. Plan and execute strategies for scaling Snowflake environments as data volume grows; monitor system health and proactively identify and resolve issues; implement automation for regular tasks. Enable seamless integration of Snowflake with BI tools like Power BI and create dashboards; support ad hoc query requests while maintaining system performance. Create and maintain documentation covering data warehouse architecture, data flow, and processes, and provide technical support, troubleshooting, and guidance to users accessing the data warehouse. Optimize Snowflake queries and manage performance, keeping up to date with emerging trends and technologies in data warehousing and data management. Good working knowledge of the Linux operating system, working experience with Git and other repository management solutions, and good knowledge of monitoring tools like Dynatrace and Splunk are expected. Serve as a technical leader for Snowflake-based projects, ensuring alignment with business goals and timelines; provide mentorship and guidance to team members on Snowflake implementation, performance tuning, and data management; collaborate with stakeholders to define and prioritize data warehousing initiatives and roadmaps; and act as the point of contact for Snowflake-related queries, issues, and initiatives. What You Will Need to Have: 8 to 10 years of experience with data management tools such as Snowflake, StreamSets, and Informatica; experience with monitoring tools like Dynatrace and Splunk; experience with Kubernetes cluster management, CloudWatch for monitoring and logging, and the Linux OS; the ability to track progress against assigned tasks, report status, and proactively identify issues; the demonstrated ability to present information effectively to peers and the project management team; and a highly organized approach that works well in a fast-paced, fluid, and dynamic environment. What Would Be Great to Have: experience with EKS for managing
Kubernetes clusters; containerization technologies such as Docker and Podman; the AWS CLI for command-line interactions; CI/CD pipelines using Harness; S3 for storage solutions and IAM for access management; banking and financial services experience; and knowledge of software development life cycle best practices. Thank You for Considering Employment with Fiserv. Please apply using your legal name, complete the step-by-step profile, and attach your resume (either is acceptable, both are preferable). Our Commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communication from a Fiserv representative will come from a legitimate Fiserv email address.
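The ETL/ELT ingestion work described in this listing often reduces to an incremental, watermark-based load: only rows newer than the last processed timestamp are copied. A minimal sketch in plain Python, assuming hypothetical source rows with an `updated_at` field (real pipelines would use StreamSets, Informatica, or dbt as the listing describes):

```python
from datetime import datetime

def incremental_load(source_rows, target, watermark):
    """Copy only rows newer than the last processed timestamp (the watermark).

    Hypothetical row shape: {"id": ..., "updated_at": datetime}.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target.extend(new_rows)
    # Advance the watermark so the next run skips already-loaded rows.
    return max((r["updated_at"] for r in new_rows), default=watermark)

source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
]
target = []
wm = incremental_load(source, target, datetime(2024, 1, 2))
print(len(target), wm)  # only row 2 is newer than the watermark
```

The same idea underlies Snowflake Streams/Tasks and dbt incremental models: track a high-water mark instead of reprocessing full history.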

Posted 5 days ago

Apply

2.0 - 5.0 years

11 - 16 Lacs

Mumbai

Work from Office

About This Role: Associate Python Developer (Finance Platform Strategies). We are seeking a technologically adept data specialist with a robust Python background and financial acumen to join the Automation & AI team within Finance Platform Strategies (FPS). Our team champions the transformative integration of automation, including Robotic Process Automation (RPA), to streamline and accelerate financial processes, ensuring peak efficiency and workflow optimization. The Finance Platform Strategies (FPS) group, within BlackRock's global Finance & Strategy organization, is responsible for long-term management of Finance platform and technology initiatives, spanning controllers, financial planning, expense management, treasury, tax, and a range of other proprietary and third-party platform capabilities. The group drives the strategic vision for, and implementation of, initiatives to enhance our platform capabilities and delivers day-to-day oversight and management of the platform, with a global footprint. Collaboration is key: you'll work closely with partners across the Finance organization, BlackRock's Aladdin Engineering team, and the Technology & Development Operations (TDO) organization to achieve our goals. Join us in shaping the future of finance through innovation and excellence. Core Responsibilities: developing and orchestrating technical solutions with a primary focus on Python frameworks, while also leveraging other technologies such as ETL tools and scripting languages, to support the automation of financial processes and workflows, data transformations, and system integrations; driving projects to completion by understanding requirements and utilizing a wide range of financial applications and automation tools; ensuring the quality, performance, and reliability of software applications through rigorous testing, debugging, and code reviews; partnering with functions across the global Finance organization to understand and solve business use cases for automation; setting up
and maintaining servers to support the Python infrastructure; staying current with the latest developments in Python technologies and industry trends in finance automation; documenting development requirements, including technical specifications, user stories, and acceptance criteria, to ensure clear communication and alignment with stakeholders; mentoring and guiding junior developers to help them grow their skills and knowledge; working closely with the Aladdin Engineering team and TDO to align technology solutions with business needs; and contributing as a Finance Technology subject matter expert (SME), developing solutions around the inventory of technical tools available within BlackRock. Required Skills and Experience: advanced proficiency in Python technologies, including a deep understanding of frameworks such as Pandas, NumPy, and PySpark, to architect and implement robust data transformation solutions; extensive experience with relational and non-relational data modeling and schema design (e.g., SQL Server, star and snowflake schemas); proven expertise in API integration, including RESTful and GraphQL, for data enrichment and processing; proficiency in data cleaning, normalization, and validation for maintaining high data quality; strong experience in data science and machine learning, with proficiency in libraries such as Scikit-learn, TensorFlow, or PyTorch; exposure to Azure Cognitive Services, offering a competitive edge in leveraging AI and machine learning capabilities; practical experience with cloud platforms, particularly Microsoft Azure, and a solid grasp of cloud services and infrastructure; proficiency in DevOps practices, with experience using tools like Azure DevOps for continuous integration and delivery; comfort working in a Linux environment, demonstrating versatility across operating systems; knowledge of cloud deployment technologies, such as Docker, to facilitate efficient deployment and scaling of applications; familiarity
with real-time data streaming platforms such as Apache Kafka; understanding of containerization and orchestration technologies like Docker and Kubernetes; strong command of data governance and security practices; and experience building intuitive, responsive user interfaces using modern frontend technologies such as Streamlit, Dash, Panel, or Flask. Good to Have: experience working with Azure Document Intelligence (ADI) for data processing; experience with GPT APIs, such as Chat Completion; familiarity with other programming languages such as C# or Java, adding a valuable dimension to the candidate's technical toolkit; validated experience in software development and the ability to autonomously understand an existing codebase; curiosity about the functional aspects of the product, with base knowledge of the finance industry being highly appreciated; strong analytical and problem-solving skills, with a proactive approach and the ability to balance multiple projects simultaneously; proficiency in written and spoken English; and exposure to data visualization tools like Matplotlib, Power BI, or Tableau. Qualifications: for candidates in India, a B.E., B.Tech, MCA, or other relevant engineering degree from a reputed university, and a minimum of 5 years of proven experience in the field. Our Benefits: to help you stay energized, engaged, and inspired, we offer a wide range of benefits, including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents, and Flexible Time Off (FTO) so you can relax, recharge, and be there for the people you care about. Our Hybrid Work Model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the
office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock: at BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes, and starting businesses. Their investments also help to strengthen the global economy: supporting businesses small and large, financing infrastructure projects that connect and power cities, and facilitating innovations that drive progress. This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued, and supported, with networks, benefits, and development opportunities to help them thrive. For additional information on BlackRock, please visit Twitter: @blackrock | LinkedIn: linkedin.com/company/blackrock. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation, and other attributes protected at law.
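The data cleaning, normalization, and validation skills this listing calls for can be sketched minimally in plain Python; in practice the team would use Pandas/NumPy as the listing states, and the record fields below are hypothetical:

```python
def clean_records(raw):
    """Normalize tickers, parse amounts, and reject invalid records.

    Hypothetical row shape: {"ticker": str, "amount": str-or-number}.
    Returns (cleaned, rejected).
    """
    cleaned, rejected = [], []
    for rec in raw:
        ticker = (rec.get("ticker") or "").strip().upper()  # normalize case/whitespace
        try:
            amount = float(str(rec.get("amount", "")).replace(",", ""))  # "1,234.5" -> 1234.5
        except ValueError:
            rejected.append(rec)  # unparseable amount fails validation
            continue
        if not ticker:
            rejected.append(rec)  # missing key field fails validation
            continue
        cleaned.append({"ticker": ticker, "amount": round(amount, 2)})
    return cleaned, rejected

raw = [
    {"ticker": " blk ", "amount": "1,234.5"},
    {"ticker": "", "amount": "10"},
    {"ticker": "AAPL", "amount": "n/a"},
]
cleaned, rejected = clean_records(raw)
print(cleaned)        # [{'ticker': 'BLK', 'amount': 1234.5}]
print(len(rejected))  # 2
```

The Pandas equivalent would express the same steps as vectorized operations (`str.strip`, `pd.to_numeric(errors="coerce")`, `dropna`).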

Posted 5 days ago

Apply

6.0 - 11.0 years

10 - 14 Lacs

Chennai

Remote

What You'll Need: BS or MS degree in Computer Science, Engineering, or a related technical field. Strong SQL skills. 6+ years of experience working with event instrumentation, data pipelines, and data warehouses, preferably acting as a data architect in a previous role. Proficiency with systems design and data modeling. Fluency with workflow management tools like Airflow or dbt. Experience with modern data warehouses like Snowflake or BigQuery. Expertise breaking down complex problems, documenting solutions, and sequencing work to make iterative improvements. Familiarity with data visualization tools such as Mode, Tableau, and Looker. Programming skills, preferably in Python. Familiarity with software design principles, including test-driven development. About the Role: Analytics Platform is on a mission to democratize learning by building systems that enable company-wide analytics and experimentation. By implementing sufficient instrumentation, designing intuitive data models, and building batch/streaming pipelines, we will allow for deep and scalable investigation and optimization of the business. By developing self-serve tools, we will empower executives, PMs, and marketing leadership and managers to understand company performance at a glance and uncover insights to support decision making. Finally, by building capabilities such as forecasting, alerting, and experimentation, we will enable more, better, and faster decisions. What You'll Do: Drive direct business impact with executive-level visibility. Design technical architecture and implement components from the ground up as we transition to event-based analytics. Work on the unique challenge of joining a variety of online and offline data sets, not just big data. Learn and grow Data Science and Data Analytics skills (we sit in the same org!)
Opportunity to grow into a Tech Lead/Manager and mentor junior team members as we quickly grow the team. Partner with infrastructure and product engineers to instrument our backend services and end-to-end user journeys to create visibility for the rest of the business. Design, develop, and monitor scalable, cost-efficient data pipelines and build out new integrations with third-party tools. Work with data analysts and data scientists to design our data models as inputs to metrics and machine learning models. Establish best practices for data engineering. Assess build-vs-buy tradeoffs for components in our company-wide analytics platform, which will inform decision-making for executives, PMs, Ops, etc. Opportunity to be a founding member of the Data Engineering team based out of IN, with the autonomy to help shape the vision, influence the roadmap, and establish best practices for the team.
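The workflow-management tools this listing names (Airflow, dbt) are, at their core, dependency-ordered task schedulers: each pipeline step declares its upstream dependencies and the scheduler derives a valid execution order. A toy sketch using only the standard library (the task names are invented):

```python
from graphlib import TopologicalSorter

# Map each pipeline step to the set of steps it depends on,
# the same shape as an Airflow DAG or a dbt model dependency graph.
tasks = {
    "load_events": set(),
    "build_sessions": {"load_events"},
    "daily_metrics": {"build_sessions"},
    "dashboard_refresh": {"daily_metrics", "build_sessions"},
}

# static_order() yields a linear ordering consistent with all dependencies;
# real schedulers additionally run independent steps in parallel and retry failures.
order = list(TopologicalSorter(tasks).static_order())
print(order)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same validation Airflow performs when parsing a DAG.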

Posted 5 days ago

Apply

6.0 - 9.0 years

18 - 25 Lacs

Bengaluru

Hybrid

About the Role We are seeking a BI Architect to advise the BI Lead of a global CPG organization and architect an intelligent, scalable Business Intelligence ecosystem. This includes an enterprise-wide KPI dashboard suite augmented by a GenAI-driven natural language interface for insight discovery. The ideal candidate will be responsible for end-to-end architecture: from scalable data models and dashboards to a conversational interface powered by Retrieval-Augmented Generation (RAG) and/or Knowledge Graphs. The solution must synthesize internal BI data with external (web-scraped and competitor) data to deliver intelligent, context-rich insights. Key Responsibilities • Architect BI Stack : Design and oversee a scalable and performant BI platform that serves as a single source of truth for key business metrics across functions (Sales, Marketing, Supply Chain, Finance, etc.). • Advise BI Lead : Act as a technical thought partner to the BI Lead, aligning architecture decisions with long-term strategy and business priorities. • Design GenAI Layer : Architect a GenAI-powered natural language interface on top of BI dashboards to allow business users to query KPIs, trends, and anomalies conversationally. • RAG/Graph Approach : Select and implement appropriate architectures (e.g., RAG using vector stores, Knowledge Graphs) to support intelligent, context-aware insights. • External Data Integration : Build mechanisms to ingest and structure data from public sources (e.g., competitor websites, industry reports, social sentiment) to augment internal insights. • Security & Governance : Ensure all layers (BI + GenAI) adhere to enterprise data governance, security, and compliance standards. • Cross-functional Collaboration : Work closely with Data Engineering, Analytics, and Product teams to ensure seamless integration and operationalization. 
Qualifications • 6-9 years of experience in BI architecture and analytics platforms, with at least 2 years working on GenAI, RAG, or LLM-based solutions. • Strong expertise in BI tools (e.g., Power BI, Tableau, Looker) and data modeling. • Experience with GenAI frameworks (e.g., LangChain, LlamaIndex, Semantic Kernel) and vector databases (e.g., Pinecone, FAISS, Weaviate). • Knowledge of graph-based data models and tools (e.g., Neo4j, RDF, SPARQL) is a plus. • Proficiency in Python or a relevant scripting language for pipeline orchestration and AI integration. • Familiarity with web scraping and structuring external/third-party datasets. • Prior experience in the CPG domain or large-scale KPI dashboarding preferred.
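The retrieval step of the RAG architecture this role designs can be illustrated with a toy example: rank candidate KPI snippets by similarity to a user's question. A production system would use learned embeddings and a vector store such as FAISS or Pinecone; here bag-of-words cosine similarity stands in, and the snippets are invented:

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts; a stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical indexed snippets describing available KPI dashboards.
snippets = [
    "weekly sales revenue by region",
    "supply chain lead time anomalies",
    "marketing campaign spend trends",
]
query = "what were sales revenue trends by region"
qv = vectorize(query)
best = max(snippets, key=lambda s: cosine(qv, vectorize(s)))
print(best)  # weekly sales revenue by region
```

In the full RAG loop, the top-ranked snippets (plus the matching KPI data) would be passed to the LLM as context for generating the conversational answer.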

Posted 5 days ago

Apply

3.0 - 7.0 years

10 - 11 Lacs

Hyderabad

Work from Office

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will need: 6+ years of experience as a Qlik developer in the analysis, design, development, testing, and implementation of business applications. Knowledge of data modeling and of developing front-end and BI reports in QlikView, Qlik Sense, and Qlik NPrinting. Good knowledge and understanding of the overall architecture of the QlikView/Qlik Sense server, QlikView Publisher, and the QlikView/Qlik Sense Management Console. Experience implementing security for QlikView/Qlik Sense applications. Extensive experience with the basic components of QlikView/Qlik Sense, such as list boxes, multi boxes, table boxes, text objects, bookmark objects, search objects, charts, pivot tables, straight tables, line/arrow objects, and buttons. Vizlib experience. Experience designing QlikView/Qlik Sense document and user settings and layouts to give clients a consistent, professional, optimized look. Experience implementing and supporting QlikView/Qlik Sense apps in production. Hands-on experience formulating KPIs within QlikView/Qlik Sense using functions and set analysis. Monitoring of the Qlik environment, including usage and performance of the environment, reports, etc.
Experience in development of Qlikview/Qliksense scripts for Data Modelling along with resolving Synthetic Key and Circular Loop issues. Hands on experience in troubleshooting and optimizing Qlikview/Qliksense loads and application experience. Experience in optimizing existing Qlikview/Qliksense reports with a focus on usability, performance, flexibility, testability, and standardization. Strong understanding of Dimensional Modelling technique, Multi-dimensional database Schemas like Star Schema, Snow flake Schema, Fact and Dimensional tables, Section Access, Set Analysis in Qlikview/Qliksense and DW concepts. Expertise in working with relational databases such as Oracle 11g/9i. Having extensive experience on visualization and performance tuning. Excellent team player with very good communication skills. Strong knowledge on joins and Concatenation to avoid synthetic keys and circular references. Requirements Good Communication and Interpretation Skills Experience in QlikSense & Qlik-N Printing Experience working on Vizlib, Mashup creation Good Hands on experience in Devops pipeline Team management Good to have understanding on Tableau or Power BI or Looker development .
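The star-schema and fact/dimension concepts the posting stresses are tool-agnostic. A minimal sketch using SQLite to show the shape (the table names, columns, and sample data are invented for illustration, not taken from the posting): descriptive attributes live in dimension tables, numeric measures live in a fact table keyed to them, and a KPI is a grouped aggregation across the joins.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables: descriptive attributes keyed by surrogate keys.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
# Fact table: numeric measures plus one foreign key per dimension.
cur.execute("""CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)", [(1, 2024, 1), (2, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0), (1, 2, 75.0)])

# KPI: revenue per category per month, via joins from the fact to each dimension.
rows = cur.execute("""
    SELECT p.category, d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d ON d.date_id = f.date_id
    GROUP BY p.category, d.year, d.month
    ORDER BY d.month""").fetchall()
```

In Qlik the same design shows up as one load per dimension plus the fact load; keeping exactly one shared field between any two tables is what avoids the synthetic keys and circular references the posting mentions.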

Posted 5 days ago

Apply

3.0 - 8.0 years

13 - 15 Lacs

Bengaluru

Work from Office

Join us to lead data modernization and maximize analytics utility. As a Data Owner Lead at JPMorgan Chase within the Data Analytics team, you play a crucial role in enabling the business to drive faster innovation through data. You are responsible for managing customer application and account opening data, ensuring its quality and protection, and collaborating with technology and business partners to execute data requirements. Job responsibilities: Document data requirements for your product and coordinate with technology and business partners to manage the change from legacy to modernized data. Model data for efficient querying and use in LLMs, utilizing the business data dictionary and metadata. Develop ideas for data products by understanding analytics needs, and create prototypes for productizing datasets. Develop proofs of concept for natural language querying and collaborate with stakeholders to roll out capabilities. Support the team in building the backlog, grooming initiatives, and leading data engineering scrum teams. Manage direct or matrixed staff to execute data-related tasks. Required qualifications, capabilities, and skills: Bachelor's degree required. 5+ years of experience in data modeling for relational, NoSQL, and graph databases. Expertise in data technologies such as analytics, business intelligence, machine learning, data warehousing, data management & governance, and AWS cloud solutions. Experience with natural language processing, machine learning, and deep learning toolkits (such as TensorFlow, PyTorch, NumPy, Scikit-Learn, Pandas). Ability to balance short-term goals and long-term vision in complex environments. Knowledge of open data standards, data taxonomy, vocabularies, and metadata management. Preferred qualifications, capabilities, and skills: Master's degree preferred.

Posted 5 days ago

Apply

8.0 - 17.0 years

45 - 55 Lacs

Mumbai

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Data Platform Engineering Lead at JPMorgan Chase within Asset and Wealth Management, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. Job responsibilities: Lead the design, development, and implementation of scalable data pipelines and ETL batches using Python/PySpark on AWS. Execute standard software solutions, design, development, and technical troubleshooting. Use infrastructure as code to build applications that orchestrate and monitor data pipelines, create and manage on-demand compute resources in the cloud programmatically, and create frameworks to ingest and distribute data at scale. Manage and mentor a team of data engineers, providing guidance and support to ensure successful product delivery and support. Collaborate proactively with stakeholders, users and technology teams to understand business/technical requirements and translate them into technical solutions. Optimize and maintain data infrastructure on the cloud platform, ensuring scalability, reliability, and performance. Implement data governance and best practices to ensure data quality and compliance with organizational standards. Monitor and troubleshoot applications and data pipelines, identifying and resolving issues in a timely manner. Stay up-to-date with emerging technologies and industry trends to drive innovation and continuous improvement. Add to a team culture of diversity, equity, inclusion, and respect.
Required qualifications, capabilities, and skills: Formal training or certification in software engineering concepts and 5+ years of applied experience. Experience in software development and data engineering, with demonstrable hands-on experience in Python and PySpark. Proven experience with cloud platforms such as AWS, Azure, or Google Cloud. Good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts. Experience with or good knowledge of cloud-native ETL platforms like Snowflake and/or Databricks. Experience with big data technologies and services like AWS EMR, Redshift, Lambda, and S3. Proven experience with efficient cloud DevOps practices and CI/CD tools like Jenkins/GitLab for data engineering platforms. Good knowledge of SQL and NoSQL databases, including performance tuning and optimization. Experience with declarative infrastructure provisioning tools like Terraform, Ansible or CloudFormation. Strong analytical skills to troubleshoot issues and optimize data processes, working independently and collaboratively. Experience leading and managing a team/pod of engineers, with a proven track record of successful project delivery. Preferred qualifications, capabilities, and skills: Knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks is a plus. Familiarity with data visualization tools and data integration patterns.
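The extract-transform-load batch pattern this role centers on can be sketched without any cloud services. A minimal, stdlib-only illustration (the CSV content, column names, and validation rule are invented for the example; in the posting's context the same extract/transform/load shape would run as a PySpark job on AWS):

```python
import csv
import io
import sqlite3

# Invented raw input; row 3 has a deliberately bad amount to show validation.
RAW = """trade_id,region,amount
1,EMEA,120.50
2,APAC,87.00
3,EMEA,not_a_number
4,AMER,310.25
"""

def extract(text):
    # Extract: parse the raw CSV into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append((int(row["trade_id"]), row["region"], float(row["amount"])))
        except ValueError:
            continue  # a real pipeline would quarantine bad records instead
    return clean

def load(rows, conn):
    # Load: write validated rows to the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS trades (trade_id INTEGER, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM trades").fetchone()
```

Keeping extract, transform, and load as separate functions is what makes each stage independently testable and monitorable, which is the property the infrastructure-as-code and pipeline-monitoring requirements above build on.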

Posted 5 days ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Career Category: Information Systems. Job Description: Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What you will do: As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member assisting in the design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of Computer Science, IT or related field experience.
Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, Spark SQL), Snowflake, workflow orchestration, and performance tuning on big data processing. Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Proficient in SQL and Python for extracting, transforming, and analyzing complex datasets from relational data stores. Proficient in Python, with strong experience in ETL tools such as Apache Spark and various data processing packages, supporting scalable data workflows and machine learning pipeline development. Strong understanding of data modeling, data warehousing, and data integration concepts. Proven ability to optimize query performance on big data platforms. Knowledge of data visualization and analytics tools like Spotfire and Power BI. Preferred Qualifications: Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing. Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms. Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS. Experience implementing Retrieval-Augmented Generation (RAG) pipelines, integrating retrieval mechanisms with language models. Skilled in developing machine learning models using Python, with hands-on experience in deep learning frameworks including PyTorch and TensorFlow. Strong understanding of data governance frameworks, tools, and best practices. Knowledge of vector databases, including implementation and optimization. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA). Professional Certifications: Databricks certification preferred; AWS Data Engineer/Architect. Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills. What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
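The workflow-orchestration skill listed above is, at its core, running pipeline tasks in dependency order. A minimal sketch using the standard library's graphlib (the task names and dependency graph are invented for illustration; production orchestrators such as Airflow or Databricks Workflows add scheduling, retries, and monitoring on top of this same idea):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it can start.
pipeline = {
    "extract_sales": set(),
    "extract_customers": set(),
    "transform_join": {"extract_sales", "extract_customers"},
    "load_warehouse": {"transform_join"},
    "refresh_dashboard": {"load_warehouse"},
}

# static_order() yields a sequence where every dependency precedes its dependents.
order = list(TopologicalSorter(pipeline).static_order())
```

The two extract tasks have no mutual dependency, so an orchestrator is free to run them in parallel; only the ordering constraints in the graph are guaranteed.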

Posted 5 days ago

Apply

12.0 - 17.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Career Category: Information Systems. Job Description: Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager - Clinical Data Hub Team. What you will do: Let's do this. Let's change the world. In this vital role you will lead an Agile product squad and be responsible for defining the vision, strategy, and implementation for a range of clinical data products supporting Amgen Clinical Trial Design & Analytics. You will collaborate closely with statisticians, data scientists, data engineers, and AI/ML engineering teams to understand business needs, identify system enhancements, and drive system implementation projects. Your extensive experience in business analysis, system design, and project management will enable you to deliver innovative and effective technology products.
Roles & Responsibilities: Define and communicate the product feature vision, including both technical/architectural features and enablement and end-user features, ensuring alignment with business objectives across multiple solution collaborator groups. Create, prioritize, and maintain the feature backlog, ensuring that it reflects the needs of the business and collaborators. Collaborate with stakeholders to gather and document product requirements, user stories, and acceptance criteria. Work closely with the business teams, Scrum Master, and development team to plan and implement sprints, ensuring that the highest-priority features are delivered. Oversee the day-to-day management of technology platforms, ensuring that they meet performance, security, and availability requirements. Ensure that platforms comply with security standards, regulatory requirements, and organizational policies. Ensure that the AIN team successfully creates robust written materials, including product documentation, the product backlog, and user stories, and creates other needed artifacts to assure efficient and effective coordination across time zones. Oversee the resolution of service-related incidents and problems, ensuring minimal impact on business operations. Maintain in-depth knowledge of clinical development business domains, with an emphasis on data assets and data pipelines, as well as an understanding of multi-functional dependencies. Analyze customer feedback and support data to identify pain points and opportunities for product improvement. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of experience in Computer Science, IT or a related field. A solid foundation in modern software design and engineering practices and business analysis.
Proven experience in understanding and gathering business requirements, delivering insight, and achieving concrete business outcomes. Technical Proficiency: Good understanding of the following technologies: Python, R, AI/ML frameworks, relational databases/data modeling, AWS services (EC2, S3, Lambda, ECS, IAM), Docker, CI/CD/GitLab, and Apache/Databricks. Expert understanding and experience of the clinical development process within Life Sciences (global clinical trial data sources, SDTM & ADaM, end-to-end clinical data design and analysis pipeline, clinical data security and governance). Experience in Agile product development as a participating member of a scrum team and its related ceremonies and processes. Ability to collaborate with data scientists and data engineers to deliver functional business requirements as well as to define the product roadmap. High learning agility: demonstrated ability to quickly grasp ever-changing technology and clinical development domain knowledge and apply it to project work. Strong communication skills in writing, speaking, and presenting, and strong time management skills. Preferred Qualifications: Training or an education degree in Computer Science, Biology, or Chemistry.
Experience with clinical data and the CDISC (SDTM and ADaM) standards. Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity, particularly about data patterns, and about learning business processes and the life of the user. Highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences. Confidence in leading teams through prioritization and sequencing discussions, including managing collaborator expectations. Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to assure clear and efficient collaboration across time zones. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly, and to retain and synthesize complex information from diverse sources. What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 5 days ago

Apply

5.0 - 9.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Career Category: Information Systems. Job Description: Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What you will do: As a Sr. Associate BI Engineer, you will support the development and delivery of data-driven solutions that enable business insights and operational efficiency. You will work closely with senior data engineers, analysts, and stakeholders to support and build dashboards, analyze data, and contribute to the design of scalable reporting systems. This is an ideal role for early-career professionals looking to grow their technical and analytical skills in a collaborative environment. Roles & Responsibilities: Design and maintain dashboards and reports using tools like Spotfire, Power BI, and Tableau. Perform data analysis to identify trends and support business decisions. Gather BI requirements and translate them into technical specifications. Support data validation, testing, and documentation efforts. Apply best practices in data modeling, visualization, and BI development.
Participate in Agile ceremonies and contribute to sprint planning and backlog grooming. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT or related field experience. Must-Have Skills: Strong knowledge of Spotfire. Exposure to other data visualization tools such as Power BI, Tableau, or QuickSight. Proficiency in SQL and scripting languages (e.g., Python) for data processing and analysis. Familiarity with data modeling, warehousing, and ETL pipelines. Understanding of data structures and reporting concepts. Strong analytical and problem-solving skills. Preferred Qualifications: Familiarity with cloud services like AWS (e.g., Redshift, S3, EC2, IAM) and Databricks (Delta Lake, Unity Catalog, tokens, etc.). Understanding of Agile methodologies (Scrum, SAFe). Knowledge of DevOps and CI/CD practices. Familiarity with scientific or healthcare data domains. Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills. What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 5 days ago

Apply