
14 Dimensional Modelling Jobs

Set up a Job Alert
JobPe aggregates listings so they are easy to find, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an experienced Data Modeler with 5+ years of proven experience in complex data environments, you will design and develop scalable, efficient, and future-proof data models to support reporting, analytics, and data integration initiatives at AceNet Consulting. Your expertise in Azure Fabrics, Azure Data Lake, and Azure Data Factory for ETL processes will be crucial in integrating data from source systems such as Oracle E-Business Suite/Fusion and Microsoft CRM into Azure platforms.

Your key responsibilities will include:
- Designing and developing data models (conceptual, logical, and physical).
- Collaborating with data engineers on data pipelines.
- Leveraging Azure Fabrics/Data Lake for data storage and processing.
- Ensuring data quality and governance.
- Optimizing data models for performance and scalability.
- Developing and enforcing best practices for data architecture, ETL processes, and data governance.
- Performing data profiling and analysis to identify and resolve data quality issues.

To excel in this role, you must possess strong problem-solving skills, excellent communication abilities for interacting with technical and non-technical stakeholders, and familiarity with data warehousing concepts, star/snowflake schemas, and dimensional modeling. Experience with tools such as Azure Purview, Power BI, Azure Synapse Analytics, and Databricks will be advantageous. Microsoft Azure certifications and familiarity with Agile methodologies and CI/CD practices for data projects are a plus.

Joining our dynamic team at AceNet Consulting will give you the opportunity to work on transformative projects, cutting-edge technology, and innovative solutions with leading global firms across various industry sectors. We offer continuous investment in employee growth and professional development, competitive compensation and benefits, ESOPs, international assignments, a supportive environment with a focus on work-life balance and employee well-being, and an open culture that values diverse perspectives and encourages transparent communication.

If you are passionate about technology, thrive in a fast-paced environment, and believe you are the ideal candidate for this position, we encourage you to apply by submitting your resume.
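For context on the star schema this posting asks for: a star schema surrounds one fact table with denormalized dimension tables joined on surrogate keys. The following is a minimal sketch using Python's standard-library sqlite3 module; all table and column names are hypothetical illustrations, not taken from the posting.

```python
import sqlite3

# A minimal star schema: one fact table joined to dimension tables
# on surrogate keys. Names are illustrative placeholders.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id  TEXT,                 -- natural/business key from the source
    region       TEXT
);
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    quantity     INTEGER,
    amount       REAL
);
""")

conn.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 1, 2024)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'C-001', 'South')")
conn.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 149.97)")

# A typical dimensional query: slice the fact table by dimension attributes.
row = conn.execute("""
    SELECT d.year, c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d     ON f.date_key = d.date_key
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY d.year, c.region
""").fetchone()
print(row)  # (2024, 'South', 149.97)
```

A snowflake schema is the same idea with the dimensions further normalized into sub-dimension tables.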

Posted 23 hours ago

Apply

3.0 - 10.0 years

0 Lacs

Karnataka

On-site

About GlobalFoundries: GlobalFoundries is a leading full-service semiconductor foundry providing a unique combination of design, development, and fabrication services to some of the world's most inspired technology companies. With a global manufacturing footprint spanning three continents, GlobalFoundries makes possible the technologies and systems that transform industries and give customers the power to shape their markets. For more information, visit www.gf.com.

The Data Solutions Group at GlobalFoundries is responsible for integrating manufacturing and engineering data from a wide variety of source systems used in the semiconductor engineering and production process. The data warehouse solutions developed by the group are used in the GlobalFoundries fabs in Dresden, the US, and Singapore. The group conceptualizes and delivers timely, high-quality solutions that cater to the analysis needs of engineers in a leading-edge semiconductor foundry.

Your Responsibilities:
- Understand the business case and translate it into a holistic solution involving Ab Initio (cloud/on-prem), Python, data ingestion, and Cloud DB Redshift/Postgres on AWS.
- Execute PL/SQL development for handling high-volume data sets.
- Prepare data warehouse design artifacts based on specified requirements, including ETL framework design, data modeling, and source-target mapping.
- Monitor database queries for tuning and optimization opportunities.
- Demonstrate problem-solving skills, familiarity with root cause analysis methods, and experience in documenting identified problems and resolutions.
- Provide recommendations for enhancements and improvements in database management practices.
- Offer consulting, interfacing, and standards related to database management while monitoring transaction activity and utilization.
- Conduct analysis and tuning for performance issues.
- Engage in data warehouse design and development, encompassing logical and physical schema design.

Other Responsibilities:
- Ensure all activities are carried out in a safe and responsible manner, adhering to Environmental, Health, Safety & Security requirements and programs.
- Maintain a customer/stakeholder focus to build strong relationships with application teams, cross-functional IT, and global/local IT teams.

Required Qualifications:
- Bachelor's or master's degree in Information Technology, Electrical Engineering, or related fields.
- Minimum 3 years of proven experience in ETL development, design, performance tuning, and optimization.
- Strong knowledge of data warehouse architecture approaches and trends, and a keen interest in applying and enhancing that knowledge, including an understanding of dimensional modelling and ERD design approaches.
- Proficiency in AWS services, Ab Initio, big data, Python, and Cloud DB Redshift.
- Proficiency in SQL and PL/SQL.
- Working experience in Kubernetes and Docker administration is advantageous.
- Familiarity with PySpark is a plus.
- Excellent conceptual abilities paired with strong technical documentation skills.
- Familiarity with SDLC concepts and processes.

Additional Skills:
- Experience in using and developing on AWS services.
- Experience in the semiconductor industry.
- Knowledge of semistructured datasets.
- Experience with reporting data solutions and business intelligence tools.
- Experience in collecting, structuring, and summarizing requirements in a data warehouse environment.
- Knowledge of statistical data analysis and data mining.
- Experience in test management, test case definition, and test processes.

Preferred Qualifications:
- Bachelor's or master's degree with a minimum of 10 years of relevant experience, including people management exposure.
- Experience with AWS Cloud Services, Ab Initio, Python, Cloud DB Redshift/Postgres, and data ingestion.
- Experience in preparing data warehouse designs (ETL framework design, data modeling, source-target mapping).

GlobalFoundries is an equal opportunity employer, fostering a diverse and inclusive workforce. We believe that a multicultural workplace enhances productivity, efficiency, and innovation, while ensuring that our employees feel respected, valued, and heard. All employment offers with GlobalFoundries are subject to the successful completion of background checks, medical screenings as applicable, and compliance with respective local laws and regulations. To maintain a safe and healthy workplace for our employees at GlobalFoundries, candidates offered employment in India must be fully vaccinated before their targeted start date. The appointment of new hires is contingent upon providing a copy of their COVID-19 vaccination document, subject to any written requests for medical or religious accommodation.
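The "source-target mapping" artifact named in the responsibilities pairs each source field with its target column and transformation rule. As a rough illustration only, in plain Python rather than Ab Initio or PL/SQL and with hypothetical field names, such a mapping can directly drive a generic transform step:

```python
# Hypothetical source-target mapping: target column -> (source field, transform).
# In practice this lives in a design document; here it drives the code directly.
MAPPING = {
    "lot_id":    ("LOT_NO", str.strip),
    "step_name": ("STEP",   str.upper),
    "yield_pct": ("YIELD",  float),
}

def transform(source_row: dict) -> dict:
    """Apply the source-target mapping to one source record."""
    return {target: fn(source_row[src]) for target, (src, fn) in MAPPING.items()}

raw = {"LOT_NO": " L1234 ", "STEP": "etch", "YIELD": "98.7"}
print(transform(raw))
# {'lot_id': 'L1234', 'step_name': 'ETCH', 'yield_pct': 98.7}
```

Keeping the mapping as data rather than hard-coded logic is what makes the design artifact and the ETL implementation stay in sync.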

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions.

The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements for meeting data analysis needs. The position requires good collaboration skills to guide the team on analytics aspects across various analytics-related activities. You should be experienced in Qlik Sense architecture design and proficient in load script implementation and best practices, with hands-on experience in Qlik Sense development, dashboarding, data modelling, and reporting techniques. You should be skilled in data integration through ETL processes from various sources, adept at data transformation including the creation of QVD files and set analysis, and capable of data modeling using dimensional modelling, star schema, and snowflake schema.

The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using the Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements; configuration, migration, and support of Qlik Sense applications; implementation of best practices; and staying updated on new technologies.

Candidates should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or have equivalent work experience. Effective verbal and written communication skills are essential. Candidates are required to have a minimum of 3-5 years of experience implementing end-to-end business intelligence using Qlik Sense, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. An understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial. Technical expertise in Qlik Sense, SQL Server, and data modeling, along with experience with clinical trial data and SDTM standards, is beneficial.

This position offers the opportunity to accelerate your skills and career within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.
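For context on the set analysis mentioned above: in Qlik it lets a measure be evaluated against a selection state other than the user's current one. Qlik load script is out of scope for a Python document, but the underlying idea can be sketched with pandas; the data below is hypothetical and assumes pandas is installed.

```python
import pandas as pd

# Hypothetical visit counts for a clinical dashboard.
df = pd.DataFrame({
    "study":  ["S1", "S1", "S2", "S2"],
    "year":   [2023, 2024, 2023, 2024],
    "visits": [120, 150, 80, 95],
})

# The user's current selection: study S1 only.
current_selection = df[df["study"] == "S1"]

# Set-analysis-like measure: ignore the study selection and fix the year,
# roughly Sum({1<year={2024}>} visits) in Qlik's notation.
fixed_year_total = df.loc[df["year"] == 2024, "visits"].sum()

print(current_selection["visits"].sum())  # 270: respects the selection
print(fixed_year_total)                   # 245: overrides it for comparison
```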

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As an ADW Senior Data Engineer, you will provide prompt and effective support, maintenance, and development on an OBIA-based analytics data warehouse using Oracle Data Integrator (ODI) as the underlying ETL tool. Your role involves implementation, development, and maintenance of the ODI environment, including data warehouse design, dimensional modeling, ETL development and support, and ETL performance tuning.

Your primary responsibilities will include solution design, implementation, migration, and support in the Oracle BI tool stack, especially ODI and SQL. You will carry out ODI development in an OBIA environment, along with enhancements, support, and performance tuning of SQL programs. Additionally, you will work on data warehouse design, development, and maintenance using a star schema (dimensional modeling).

You will be responsible for production support of daily running ETL loads, monitoring, troubleshooting failures, and bug fixing across environments. Experience in the Oracle BI Analytics Warehouse methodology and star schema is essential, along with experience working with different data sources such as Oracle, CRM, Cloud, flat files, SharePoint, and other non-Oracle systems. Performance tuning of mappings in ODI and SQL query tuning are also part of your expertise.

Your expertise in data warehousing concepts such as SCDs, dimensional modeling, archive strategy, aggregation, and hierarchies, and database concepts such as partitioning and materialized views, will be crucial. Migration and other deployment activities in the Oracle BI tool stack (ODI), Kintana, and PVCS are also within your scope of responsibilities. Working knowledge of OBIEE, strong Oracle database experience, and an understanding of BI/data warehouse analysis, design, development, and testing are required, together with a strong understanding of change management processes and basic knowledge of SBM and ServiceNow.

To excel in this role, you should have 5-7+ years of relevant experience working in OBIA with ODI as the ETL tool in a BIAPPS environment. Strong written and oral communication skills, the ability to work in a demanding user environment, and knowledge of tools such as Serena Business Manager and ServiceNow are essential. Coordinating among various teams, working with project managers, designing and improving BI processes, and readiness to work in a 24x7 environment are key aspects of this position.

The required qualification for this role is B.Tech / MCA, and the desired competencies include being tech-savvy, effective communication, optimizing work processes, cultivating innovation, and being a good team player.
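Among the warehousing concepts listed, SCDs deserve a concrete illustration: a Type 2 slowly changing dimension keeps history by expiring the current row and inserting a new version. The sketch below uses Python's sqlite3 with hypothetical table and column names; an ODI mapping would generate comparable DML.

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_id TEXT,     -- natural key
    city        TEXT,     -- tracked attribute
    valid_from  TEXT,
    valid_to    TEXT,     -- NULL while the row is current
    is_current  INTEGER
)""")
conn.execute("INSERT INTO dim_customer VALUES ('C-001', 'Pune', '2020-01-01', NULL, 1)")

def scd2_update(conn, customer_id, new_city, as_of):
    """Type 2 change: expire the current row, then insert the new version."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1", (as_of, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, as_of))

scd2_update(conn, "C-001", "Mumbai", str(date(2024, 6, 1)))
for row in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)
# ('C-001', 'Pune',   '2020-01-01', '2024-06-01', 0)
# ('C-001', 'Mumbai', '2024-06-01', None,          1)
```

Fact rows then join to whichever dimension version was current on the transaction date, preserving history for reporting.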

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an ETL Data Engineer, you will leverage your advanced database skills and extensive experience in T-SQL, performance tuning, and optimization to streamline data integration and transformation processes using ETL tools such as SSIS. With 5-7 years of experience, you should have a deep understanding of data warehousing, business intelligence, and dimensional modeling principles, and the ability to manage multiple deliverables and projects simultaneously.

You will be expected to participate in an initial profile screening and progress through L1 and L2 technical interview rounds; the L2 interview is a face-to-face discussion in Bangalore. To excel in this role, you must have excellent communication and collaboration skills, enabling you to work efficiently within a team, along with comprehensive data warehousing and business intelligence expertise to keep data processes optimized.

Please note that PAN and date of birth details are mandatory for application submission, and your CV should be in MS Word format. If you meet the above requirements and are enthusiastic about applying your ETL data engineering expertise, we encourage you to apply for this exciting opportunity.

Posted 4 days ago

Apply

9.0 - 12.0 years

9 - 12 Lacs

Chandigarh, India

Remote

Job Summary:
If you are looking for an opportunity in Technology Solutions and Development, Emerson has an exciting role for you! The Senior ETL Developer - Oracle will be part of a team responsible for developing ETL programs and improving the performance of poorly written or poorly performing application code in the Oracle Data Integrator tool. This includes existing code and new code that has not yet been promoted to production. The team delivers technology solutions for strategic business needs, drives adoption of these services and support processes, and boosts value by enhancing our customers' experience. This role works alongside a hardworking and dedicated team of self-motivated professionals who share a collective passion for progress and excellence.

In This Role, Your Responsibilities Will Be:
- Design end-to-end solutions to cater to business data needs in the Oracle BI tool stack, especially ODI and SQL.
- Design custom data warehouse solutions specific to business needs.
- Apply expertise in dimensional modelling (star/snowflake schema) and draft high-level and low-level DWH designs.
- Prepare data lineage sheets.
- Apply data warehousing concepts such as SCDs, dimensional modelling, archive strategy, aggregation, and hierarchies, and database concepts such as partitioning and materialized views.
- Develop in ODI (in a BIAPPS environment) and performance-tune SQL programs.
- Automate ETL jobs, failure notifications, and related processes (see the sketch after this posting).
- Review and suggest DWH design optimization solutions.
- Tune the performance of mappings in ODI and of SQL queries.
- Provide production support of daily running ETL loads: monitoring, troubleshooting failures, and bug fixing across environments.
- Work with multiple databases (Oracle, SQL Server, etc.), including complex SQL, query debugging, and optimization.
- Maintain a strong understanding of business analytics and of data warehouse analysis, design, development, and testing.

Who You Are:
You show a tremendous amount of initiative in tough situations and are exceptional at spotting and seizing opportunities. You observe situational and group dynamics and select the best-fit approach. You make implementation plans that allocate resources precisely. You pursue everything with energy, drive, and the need to finish.

For This Role, You Will Need:
- 8+ years of relevant experience working in OBIA with ODI as the ETL tool in a BIAPPS environment.
- Exposure to working in other ETL tools.
- Ability to work on highly complex problems and provide feasible solutions in time.
- Ability to review and suggest improvements to existing DWH solutions.
- Ability to work in a demanding user environment.
- Ability to provide training on ETL, DWH, and dimensional modelling.
- Ability to guide and help team members with technical issues.
- Strong written and oral communication skills.
- Coordination among various teams for day-to-day activity.

Preferred Qualifications That Set You Apart:
- Bachelor's degree or equivalent in Science with a technical background (MIS, Computer Science, Engineering, or any related field).
- Good interpersonal skills in English, both spoken and written, as you will be working with an overseas team.

Emerson's compensation and benefits programs are designed to be competitive within the industry and local labor markets. We also offer comprehensive medical and insurance coverage to meet the needs of our employees.
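On the ETL automation and failure-notification responsibility above: independent of ODI, the basic pattern is a wrapper that runs each load step, logs the outcome, and alerts on failure. A minimal Python sketch with hypothetical job names and a placeholder alert hook:

```python
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def notify_on_failure(job_name: str, error: Exception) -> None:
    """Placeholder alert hook: in production this might send an email or page."""
    logging.error("ETL job %s failed: %s", job_name, error)

def run_job(job_name: str, job_fn) -> bool:
    """Run one ETL step, log success, and notify on failure."""
    try:
        job_fn()
        logging.info("ETL job %s completed", job_name)
        return True
    except Exception as exc:  # broad by design: any failure must be reported
        notify_on_failure(job_name, exc)
        return False

def extract_orders():
    pass  # stand-in for a real extract step

def load_warehouse():
    raise RuntimeError("simulated db timeout")  # stand-in for a failing load

# Hypothetical nightly chain: stop at the first failed step.
steps = [("extract_orders", extract_orders), ("load_warehouse", load_warehouse)]
for name, fn in steps:
    if not run_job(name, fn):
        break
```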
We are committed to creating a global workplace that supports diversity and equity and embraces inclusion. We welcome foreign nationals to join us through our Work Authorization Sponsorship. We have established our Remote Work Policy for eligible roles to promote work-life balance through a hybrid work setup, where our team members can take advantage of working both from home and at the office. Safety is paramount to us, and we are relentless in our pursuit of a safe working environment across our global network and facilities.

Posted 5 days ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an Azure Architect at Fractal, you will be part of a dynamic team in the Artificial Intelligence space dedicated to empowering human decision-making in the enterprise. You will contribute to the development and delivery of innovative solutions that assist Fortune 500 companies in making strategic and tactical decisions. Fractal is renowned for cutting-edge products such as Qure.ai, Cuddle.ai, Theremin.ai, and Eugenie.ai, which leverage AI to drive business success.

Joining our Technology team, you will play a pivotal role in designing, building, and maintaining technology services for our global clientele. Your responsibilities will include actively participating in client engagements, developing end-to-end solutions for large projects, and applying software engineering principles to deliver scalable solutions. You will be instrumental in enhancing our technology capabilities to ensure successful project delivery.

To excel in this role, you should hold a bachelor's degree in Computer Science or a related field and possess 5-10 years of experience in technology. Your expertise should span System Integration, Application Development, and Data Warehouse projects, encompassing a variety of enterprise technologies. Proficiency in object-oriented languages such as Python and PySpark, as well as relational and dimensional modeling, is essential, and a strong command of Microsoft Azure components such as Azure Databricks, Azure Data Factory, and Azure SQL is mandatory.

We are seeking individuals with a forward-thinking mindset and a proactive approach to problem-solving. If you are passionate about leveraging cloud technology and machine learning to drive business innovation, and you thrive in a collaborative environment with like-minded professionals, we invite you to explore a rewarding career at Fractal. If you are ready to embrace challenges, foster growth, and collaborate with a team of high-performing individuals, we look forward to discussing how you can contribute to our mission of transforming decision-making with AI. Join us on this exciting journey towards shaping the future of enterprise decision-making!
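Since the posting pairs PySpark with relational and dimensional modeling, here is a small sketch of the classic fact-to-dimension join and aggregate in PySpark. It assumes pyspark is installed, and all dataset and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dim-model-sketch").getOrCreate()

# Hypothetical dimension and fact data.
dim_product = spark.createDataFrame(
    [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")],
    ["product_key", "name", "category"])
fact_sales = spark.createDataFrame(
    [(1, 100.0), (1, 50.0), (2, 75.0)],
    ["product_key", "amount"])

# Classic dimensional query: join the fact to a dimension on its surrogate
# key, then aggregate by a dimension attribute.
result = (fact_sales
          .join(dim_product, "product_key")
          .groupBy("category")
          .agg(F.sum("amount").alias("total_amount")))
result.show()
# +--------+------------+
# |category|total_amount|
# +--------+------------+
# |Hardware|       225.0|
# +--------+------------+

spark.stop()
```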

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As an Oracle Data Integrator (ODI) professional at YASH Technologies, you will play a key role in providing prompt and effective support, maintenance, and development on an OBIA-based analytics data warehouse using ODI as the underlying ETL tool. Your responsibilities will include implementation, development, and maintenance of the ODI environment, data warehouse design, dimensional modeling, ETL development and support, and ETL performance tuning.

You will be responsible for solution design, implementation, migration, and support in the Oracle BI tool stack, particularly ODI and SQL. Your tasks will involve ODI development in an OBIA environment, along with enhancements, support, and performance tuning of SQL programs. Additionally, you will be involved in data warehouse design, development, and maintenance using a star schema (dimensional modeling).

Your role will also encompass production support of daily running ETL loads, monitoring, troubleshooting failures, and bug fixing across environments, as well as working with various data sources including Oracle, CRM, Cloud, flat files, SharePoint, and other non-Oracle systems. Experience in performance tuning of mappings in ODI and SQL query tuning is essential.

To succeed in this role, you should have 5-7+ years of relevant experience working in OBIA with ODI as the ETL tool in a BIAPPS environment. Strong written and oral communication skills, the ability to work in a demanding user environment, and knowledge of tools such as Serena Business Manager and ServiceNow are crucial. A B.Tech / MCA qualification is required, and competencies such as being tech-savvy, effective communication, optimizing work processes, and cultivating innovation are essential.

At YASH, you will have the opportunity to build a career path in an inclusive team environment that supports continuous learning and development. Our Hyperlearning workplace is grounded on principles such as flexible work arrangements, agile self-determination, trust, transparency, and stable employment within an ethical corporate culture. Join us at YASH Technologies and be part of a team that fosters positive change in an ever-evolving virtual world.

Posted 1 week ago

Apply

2.0 - 8.0 years

0 Lacs

Maharashtra

On-site

This is a great opportunity for an Oracle Analytics Cloud Services (OACS) professional with 2 to 8 years of IT experience. In this role, you will develop and implement Business Intelligence and data warehousing solutions using OBIEE/OAC/OAS. Your tasks will include analysis, design, development, customization, implementation, and maintenance of OBIEE/OAS/OAC.

To excel in this position, you must have prior experience working with OAC, including security design and implementation. Proficiency in RPD development and dimensional modelling is essential, along with expertise in handling multi-fact-table dimensional modelling, facts at different grain levels, MUDE environments, hierarchies, fragmentation, and so on. A good understanding of OAC DV is also required, and knowledge of OAC/Oracle DV/FAW/OBIA and BI Publisher would be a plus.

Strong SQL skills and the ability to debug queries are necessary for this role, as is expertise in front-end development, creating OBIEE/OAS/OAC reports and dashboards using different views. Experience in performance tuning of reports, using techniques such as indexing, caching, aggregation, SQL modification, and hints, will be beneficial. Effective communication skills are vital, and you should be organized enough to deliver high-quality solutions using OBIEE/OAS/OAC.

This position is available in multiple locations, including Mumbai, Pune, Kolkata, Chennai, Coimbatore, Delhi, and Bangalore. If you are ready to showcase your skills and contribute to a dynamic team, please reach out to komal.sutar@ltimindtree.com for further details.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have a minimum of 8 years of experience as a Power BI Developer, with 7 to 12 years of total experience, and hands-on experience handling teams and clients. You should possess expert knowledge of advanced calculations in MS Power BI Desktop, including DAX functions such as aggregate, date, logical, string, and table functions. Prior experience connecting Power BI to both on-premise and cloud computing platforms is required.

A deep understanding of, and the ability to use and explain, various aspects of relational database design, multidimensional database design, OLTP, OLAP, KPIs, scorecards, and dashboards is essential for this role. You should have a very good understanding of data modeling techniques for analytical data, including facts, dimensions, and measures. Experience in data warehouse design, specifically dimensional modeling, and in data mining will be beneficial for this position. Additionally, hands-on experience in SSIS, SSRS, and SSAS will be considered a plus.
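DAX itself is out of scope for a Python document, but the shape of one common date-function pattern from the list above, a running year-to-date total in the spirit of DAX's TOTALYTD, can be sketched with pandas. The data here is a hypothetical illustration.

```python
import pandas as pd

# Hypothetical monthly sales for one year.
sales = pd.DataFrame({
    "month":  pd.date_range("2024-01-01", periods=4, freq="MS"),
    "amount": [100, 120, 90, 110],
})

# Year-to-date running total: cumulative sum that resets each calendar year,
# analogous in spirit to a DAX time-intelligence measure.
sales["ytd_amount"] = sales.groupby(sales["month"].dt.year)["amount"].cumsum()
print(sales)
#        month  amount  ytd_amount
# 0 2024-01-01     100         100
# 1 2024-02-01     120         220
# 2 2024-03-01      90         310
# 3 2024-04-01     110         420
```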

Posted 2 weeks ago

Apply

2.0 - 8.0 years

0 Lacs

Maharashtra

On-site

This is a job opportunity for an Oracle Analytics Cloud Services (OACS) professional with 2 to 8 years of experience. You should have 3-8 years of IT experience focused on the development and implementation of Business Intelligence and data warehousing solutions using OBIEE/OAC/OAS, with knowledge of analysis, design, development, customization, implementation, and maintenance of OBIEE/OAS/OAC.

Your role will involve working in Oracle Analytics Cloud (OAC), including security design and implementation. You must have expertise in RPD development and dimensional modeling, with a focus on handling multi-fact-table dimensional modeling, facts at different grain levels, MUDE environments, hierarchies, and fragmentation, among others. You should also have good knowledge of OAC DV; knowledge of OAC/Oracle DV/FAW/OBIA and BI Publisher is considered a plus.

Proficiency in writing SQL and debugging queries is crucial for this role, along with strong skills in front-end development and in creating OBIEE/OAS/OAC reports and dashboards using various views. Experience in performance tuning of reports through techniques such as indexing, caching, aggregation, SQL modification, and hints is required. Excellent communication skills, organization, and the ability to deliver high-quality solutions using OBIEE/OAS/OAC are essential for this position.

The job location options include Mumbai, Pune, Kolkata, Chennai, Coimbatore, Delhi, and Bangalore. For more details or to apply, please reach out to komal.sutar@ltimindtree.com.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us on our website and social channels.

Inviting applications for the role of Lead Consultant - Snowflake & Informatica Cloud Data Engineer.

Responsibilities:
- Design and implement scalable data solutions in Snowflake, following data engineering best practices and a layered architecture.
- Design and implement scalable data pipelines and ETL/ELT processes using dbt, integrated with Snowflake, for modern cloud data warehousing.
- Develop and optimize transformation logic and storage structures in Snowflake using SQL, Python, and Airflow.
- Collaborate with business and technical teams to translate data requirements into robust dbt-on-Snowflake integration solutions.
- Ensure data quality, security, and compliance by applying governance best practices across data transformation pipelines and within Snowflake environments.
- Perform performance tuning in Snowflake and streamline ETL pipelines for efficient execution, supported by clear documentation of architecture and integration patterns.

Qualifications we seek in you!

Minimum Qualifications:
- Bachelor's degree in information science, data management, computer science, or a related field preferred.
- Must have experience in the cloud data engineering domain.
- Proven experience in cloud data engineering using Snowflake and Informatica, with hands-on delivery of end-to-end data pipeline implementations.
- Strong knowledge of data warehousing, ELT/ETL design, OLAP concepts, and dimensional modelling using Snowflake, with experience in projects delivering complete data solutions.
- Hands-on expertise in developing, scheduling, and orchestrating scalable ETL/ELT pipelines using Informatica Cloud or PowerCenter.
- Proficiency in Python for data transformation and automation tasks integrated with Snowflake environments.
- Excellent communication and documentation skills, with the ability to clearly articulate Snowflake architectures and Informatica workflows.
- Experience implementing data quality, lineage, and governance frameworks using Informatica and Snowflake capabilities.
- Familiarity with CI/CD practices for deploying Informatica workflows and Snowflake objects within DevOps environments.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
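The responsibilities above pair Snowflake transformations with Airflow orchestration. As a rough sketch only, assuming a recent Apache Airflow 2.x installation and using hypothetical DAG, task, and script names, a daily pipeline that stages data and then runs dbt models might look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: load raw data, then run dbt transformations
# against Snowflake. Commands and names are illustrative placeholders.
with DAG(
    dag_id="snowflake_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python load_raw_to_snowflake.py",  # placeholder script
    )
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run",  # runs the dbt project's models
    )

    load_raw >> run_dbt  # run transformations only after the load succeeds
```

The dependency operator at the end is the whole point of the orchestrator: transformation steps never run against a partially loaded warehouse.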

Posted 3 weeks ago

Apply

8.0 - 13.0 years

8 - 13 Lacs

Gurgaon, Haryana, India

On-site

Responsibilities:
- In-depth understanding of data warehouse design, including dimensional modelling, data vault, star schema, and snowflake schema; make or adjust designs on the fly with proper justification.
- Design data models using tools such as Visio or Erwin.
- Prepare mapping documents and present them to ETL and testing teams for implementation.
- Interact with other architects and business stakeholders to gather and validate requirements and obtain signoffs.
- Manage multiple initiatives and projects from start to finish.
- Maximize the value derived from data and analytics; prioritize data needs and surface critical insights.
- Enable enterprise data management and ensure data security.
- Improve the performance of Datahub and analytic assets.
- Provide expertise in data architecture and solution design.

Role Requirements and Qualifications:
- Minimum of 8 years of IT experience, with at least 3 years in information system design or data architecture.
- Degree in Computer Science, Information Systems, or a related field preferred.
- Extensive experience in designing and implementing information solutions and database structures.
- Strong leadership and people management skills, with a track record of motivating teams across cultures.
- Hands-on experience with data and analytics management programs is a plus.
- Ability to analyze project and program needs and mobilize cross-functional resources.
- Strategic thinking, problem-solving, emotional intelligence, cultural awareness, and resilience.
- Ethical team player with a drive to innovate, deliver, and build strong partnerships.
- Excellent communicator capable of influencing stakeholders at all levels.

Posted 1 month ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Pune

Work from Office

Role & Responsibilities:
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads (see the Streams/Tasks sketch after this posting).
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have:
- 7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data-modeling (dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
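Streams and Tasks, named in the first responsibility, are Snowflake's change-capture and scheduling primitives: a stream records changes to a table since it was last read, and a task periodically pushes those changes onward. A hedged sketch, driving standard Snowflake SQL from Python via the snowflake-connector-python package; the connection parameters and object names are placeholders.

```python
import snowflake.connector

# Placeholder credentials: replace with real account details.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ETL_WH", database="SALES_DB", schema="RAW",
)
cur = conn.cursor()

# A stream records inserts/updates/deletes on the raw table since last read.
cur.execute("CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders")

# A task drains the stream on a schedule, pushing changes into the fact table,
# and only fires when the stream actually has data.
cur.execute("""
CREATE OR REPLACE TASK load_fact_orders
  WAREHOUSE = ETL_WH
  SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  INSERT INTO fact_orders (order_id, amount)
  SELECT order_id, amount FROM raw_orders_stream
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK load_fact_orders RESUME")
```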

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
