
33 Denormalization Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About the Position: Are you a passionate backend engineer looking to make a significant impact? Join our cross-functional, distributed team responsible for building and maintaining the core backend functionality that powers our customers. You'll be instrumental in developing scalable and robust solutions, directly impacting the efficiency and reliability of our platform. This role offers a unique opportunity to work on cutting-edge technologies and contribute to a critical part of our business, all within a supportive and collaborative environment.

Role: Junior .NET Engineer
Location: Hyderabad
Experience: 3 to 5 years
Job Type: Full-Time Employment

What You'll Do:
- Implement features/modules per the design and requirements shared by Architects, Leads, and BAs/PMs, using coding best practices.
- Develop and maintain microservices using C# and .NET Core; perform unit testing to the code-coverage benchmark.
- Support testing and deployment activities.
- Build containerized microservices (Docker, Kubernetes, Ansible, etc.).
- Create and maintain RESTful APIs to facilitate communication between microservices and other components.
- Analyze and fix defects to deliver stable, high-standard code per design specifications.
- Use version control systems (e.g., Git) to manage source code.
- Requirement Analysis: Understand and analyze functional/non-functional requirements, seeking clarification from Architects/Leads where needed. Participate in estimation for given requirements.
- Coding and Development: Write clean, maintainable code using software development best practices. Use code-analyzer tools. Follow a TDD approach for implementation. Perform coding and unit testing per design.
- Problem Solving/Defect Fixing: Investigate and debug raised defects; find root causes, explore alternative approaches, and fix defects with appropriate solutions. Fix defects identified during functional/non-functional testing and UAT within agreed timelines. Estimate defect fixes for yourself and the team.
- Deployment Support: Provide prompt response during production support.

Expertise You'll Bring:
- Language: C#; Visual Studio Professional; Visual Studio Code; .NET Core 3.1 onwards; Entity Framework with a code-first approach; Dependency Injection; error handling and logging; SDLC; Object-Oriented Programming (OOP) principles; SOLID principles; clean-coding principles; design patterns.
- API: REST APIs with token-based authentication and authorization; Postman; Swagger.
- Database: relational databases (SQL Server/MySQL/PostgreSQL); stored procedures and functions; relationships, data normalization and denormalization, indexes, and performance-optimization techniques.

Preferred Skills:
- Development exposure to cloud: Azure/GCP/AWS.
- Code quality tooling: Sonar.
- Exposure to CI/CD processes and tools such as Jenkins.
- Good understanding of Docker and Kubernetes.
- Exposure to Agile software development methodologies and ceremonies.

Benefits:
- Competitive salary and benefits package.
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications.
- Opportunity to work with cutting-edge technologies.
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
- Annual health check-ups.
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a value-driven, people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally.
- Impact the world in powerful, positive ways, using the latest technologies.
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core.
- Unlock global opportunities to work and learn with the industry's best.

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."

Posted 5 hours ago

Apply

8.0 years

0 Lacs

India

Remote


Job Title: Data Engineer
Experience: 5-8 Years
Location: Remote
Shift: IST (Indian Standard Time)
Contract Type: Short-Term Contract

Job Overview: We are seeking an experienced Data Engineer with deep expertise in Microsoft Fabric to join our team on a short-term contract basis. You will play a pivotal role in designing and building scalable data solutions and enabling business insights in a modern cloud-first environment. The ideal candidate will have a passion for data architecture, strong hands-on technical skills, and the ability to translate business needs into robust technical solutions.

Key Responsibilities:
- Design and implement end-to-end data pipelines using Microsoft Fabric components (Data Factory, Dataflows Gen2).
- Build and maintain data models, semantic layers, and data marts for reporting and analytics.
- Develop and optimize SQL-based ETL processes integrating structured and unstructured data sources.
- Collaborate with BI teams to create effective Power BI datasets, dashboards, and reports.
- Ensure robust data integration across on-premises and cloud platforms.
- Implement mechanisms for data quality, validation, and error handling.
- Translate business requirements into scalable, maintainable technical solutions.
- Optimize data pipelines for performance and cost-efficiency.
- Provide technical mentorship to junior data engineers as needed.

Required Skills:
- Hands-on experience with Microsoft Fabric: Dataflows Gen2, Pipelines, OneLake.
- Strong proficiency in Power BI, including semantic modeling and dashboard/report creation.
- Deep understanding of data modeling techniques: star schema, snowflake schema, normalization, denormalization.
- Expertise in SQL, stored procedures, and query performance tuning.
- Experience integrating data from diverse sources: APIs, flat files, databases, and streaming.
- Knowledge of data governance, lineage, and data catalog tools within the Microsoft ecosystem.
- Strong problem-solving skills and ability to manage large-scale data workflows.
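Normalization and denormalization come up in nearly every listing on this page. As a rough illustration of the trade-off (using sqlite3 as a generic stand-in for the warehouses the postings name; the customer/order schema and all names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized (3NF) schema: orders reference customers by key only.
cur.executescript("""
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(customer_id),
                     amount REAL);
INSERT INTO customers VALUES (1, 'Asha', 'South'), (2, 'Ravi', 'East');
INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 120.0), (12, 2, 300.0);
""")

# Denormalized reporting table: customer attributes are copied onto each
# order row, trading storage and update cost for join-free reads.
cur.executescript("""
CREATE TABLE orders_denorm AS
SELECT o.order_id, o.amount, c.name AS customer_name, c.region
FROM orders o JOIN customers c USING (customer_id);
""")

# Aggregation now runs without touching the customers table at all.
rows = cur.execute(
    "SELECT region, SUM(amount) FROM orders_denorm GROUP BY region ORDER BY region"
).fetchall()
print(rows)
```

The usual caveat applies: the denormalized copy must be rebuilt or kept in sync whenever a customer attribute changes, which is why it typically lives in a reporting layer rather than the system of record.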

Posted 10 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Spendflo is a fast-growing Series A startup helping companies streamline how they procure, manage, and optimize their software and services. Backed by top-tier investors, we're building the most intelligent, automated platform for procurement operations. We are now looking for a Senior Data Engineer to design, build, and scale our data infrastructure. You'll be the backbone of all data movement at Spendflo, from ingestion to transformation to reporting.

What You'll Do:
- Design, implement, and own the end-to-end data architecture at Spendflo.
- Build and maintain robust, scalable ETL/ELT pipelines across multiple sources and systems.
- Develop and optimize data models for analytics, reporting, and product needs.
- Own the reporting layer and work with PMs, analysts, and leadership to deliver actionable data.
- Ensure data quality, consistency, and lineage through validation and monitoring.
- Collaborate with engineering, product, and data science teams to build seamless data flows.
- Optimize data storage and query performance for scale and speed.
- Own documentation for pipelines, models, and data flows.
- Stay current with the latest data tools and bring in the right technologies.
- Mentor junior data engineers and help establish data best practices.

Qualifications:
- 5+ years of experience as a data engineer, preferably in a product/startup environment.
- Strong expertise in building ETL/ELT pipelines using modern frameworks (e.g., Dagster, dbt, Airflow).
- Deep knowledge of data modeling (star/snowflake schemas, denormalization, dimensional modeling).
- Hands-on with SQL (advanced queries, performance tuning, window functions, etc.).
- Experience with cloud data warehouses like Redshift, BigQuery, Snowflake, or similar.
- Comfortable working with cloud platforms (AWS/GCP/Azure) and tools like S3, Lambda, etc.
- Exposure to BI tools like Looker, Power BI, Tableau, or equivalent.
- Strong debugging and performance-tuning skills.
- Excellent communication and documentation skills.

Preferred Qualifications:
- Built or managed large-scale, cloud-native data pipelines.
- Experience with real-time or stream processing (Kafka, Kinesis, etc.).
- Understanding of data governance, privacy, and security best practices.
- Exposure to machine learning pipelines or collaboration with data science teams.
- Startup experience: able to handle ambiguity, a fast pace, and end-to-end ownership.

(ref:hirist.tech)

Posted 15 hours ago

Apply

9.0 years

0 Lacs

Greater Kolkata Area

On-site


Job Description:
- 9+ years of working experience in Data Engineering and Data Analytics projects, implementing Data Warehouse, Data Lake, and Lakehouse solutions and the associated ETL/ELT patterns.
- Worked as a Data Modeller on one or two implementations, creating and implementing data models and database designs using Dimensional and ER models.
- Good knowledge of and experience in modelling complex scenarios such as many-to-many relationships, SCD types, late-arriving facts and dimensions, etc.
- Hands-on experience with at least one data modelling tool such as Erwin, ER/Studio, Enterprise Architect, or SQLDBM.
- Experience working closely with business stakeholders/business analysts to understand functional requirements and translate them into data models and database designs.
- Experience creating conceptual and logical models and translating them into physical models that address both functional and non-functional requirements.
- Strong knowledge of SQL; able to write complex queries and profile data to understand relationships and data quality issues.
- Very strong understanding of database modelling and design principles such as normalization, denormalization, and isolation levels.
- Experience in performance optimization through database design (physical modelling).
- Good communication skills.

(ref:hirist.tech)
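This listing (and several others on the page) names slowly changing dimensions (SCD types) as a core modelling scenario. A minimal Type 2 sketch, which keeps full history by closing the current row and inserting a new one, might look like this (sqlite3 as a stand-in engine; the customer dimension and all names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    customer_id INTEGER,                   -- natural (business) key
    city TEXT,
    valid_from TEXT,
    valid_to TEXT,                         -- NULL means "still current"
    is_current INTEGER
);
INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current)
VALUES (1, 'Pune', '2023-01-01', NULL, 1);
""")

def scd2_update(cur, customer_id, new_city, change_date):
    """SCD Type 2: close the current row, then open a new current row."""
    cur.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (change_date, customer_id),
    )
    cur.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, change_date),
    )

scd2_update(cur, 1, "Hyderabad", "2024-06-01")
history = cur.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 1 ORDER BY sk"
).fetchall()
print(history)  # both versions survive; only the latest is flagged current
```

Facts that arrive after the change join to the correct historical row via the surrogate key, which is the point of Type 2 over a Type 1 overwrite.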

Posted 15 hours ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site


JD for Data Modeler

Key Requirements:
- 8+ years of total experience, with 8 years of hands-on experience in data modelling.
- Expertise in conceptual, logical, and physical data modeling.
- Proficient in tools such as Erwin, SQLDBM, or similar.
- Strong understanding of data governance and database design best practices.
- Excellent communication and collaboration skills.
- 8+ years of working experience in Data Engineering and Data Analytics projects, implementing Data Warehouse, Data Lake, and Lakehouse solutions and the associated ETL/ELT patterns.
- Worked as a Data Modeller on one or two implementations, creating and implementing data models and database designs using Dimensional and ER models.
- Good knowledge of and experience in modelling complex scenarios such as many-to-many relationships, SCD types, late-arriving facts and dimensions, etc.
- Hands-on experience with at least one data modelling tool such as Erwin, ER/Studio, Enterprise Architect, or SQLDBM.
- Experience working closely with business stakeholders/business analysts to understand functional requirements and translate them into data models and database designs.
- Experience creating conceptual and logical models and translating them into physical models that address both functional and non-functional requirements.
- Strong knowledge of SQL; able to write complex queries and profile data to understand relationships and data quality issues.
- Very strong understanding of database modelling and design principles such as normalization, denormalization, and isolation levels.
- Experience in performance optimization through database design (physical modelling).

(ref:hirist.tech)

Posted 15 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Data Modeler JD
- Proven experience as a Data Modeler or in a similar role (8 years, depending on seniority level).
- Proficiency in data modeling tools (e.g., ER/Studio, Erwin, SAP PowerDesigner, or similar).
- Strong understanding of database technologies (e.g., SQL Server, Oracle, PostgreSQL, Snowflake).
- Experience with cloud data platforms (e.g., AWS, Azure, GCP).
- Familiarity with ETL processes and tools.
- Excellent knowledge of normalization and denormalization techniques.
- Strong analytical and problem-solving skills.

Posted 5 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


- Proficiency in data modeling tools such as ER/Studio, ERwin, or similar.
- Deep understanding of relational database design, normalization/denormalization, and data warehousing principles.
- Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake.
- Strong knowledge of metadata management, data lineage, and data governance practices.
- Understanding of data integration, ETL processes, and data quality frameworks.
- Ability to interpret and translate complex business requirements into scalable data models.
- Excellent communication and documentation skills for collaborating with cross-functional teams.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Key Responsibilities:

Ab Initio Development & Optimization:
- Design, develop, test, and deploy high-performance, scalable ETL/ELT solutions using Ab Initio components (GDE, Co>Operating System, EME, Control Center).
- Translate complex business requirements and data transformation rules into efficient, maintainable Ab Initio graphs and plans.
- Optimize existing Ab Initio applications for improved performance, resource utilization, and reliability.
- Troubleshoot, debug, and resolve complex data quality and processing issues within Ab Initio graphs and systems.

Data Modeling & Advanced SQL:
- Apply expertise in advanced SQL to write complex queries for data extraction, transformation, validation, and analysis across various relational databases (e.g., DB2, Oracle, SQL Server).
- Design and implement efficient relational data models (e.g., star schema, snowflake schema, 3NF) for data warehousing and analytics.
- Understand and apply big data modeling concepts (e.g., denormalization for performance, schema-on-read, partitioning strategies for distributed systems).

Spark & Big Data Integration:
- Collaborate with data architects on data integration strategies in a hybrid environment, understanding how Ab Initio processes interact with or feed into big data platforms.
- Analyze and debug data flow issues that may span traditional ETL and big data platforms (e.g., HDFS, Hive, Spark).
- Demonstrate strong foundational knowledge of Apache Spark, including Spark SQL and DataFrame operations, to comprehend and potentially assist in debugging Spark-based pipelines.

Collaboration & Documentation:
- Work effectively with business analysts, data architects, QA teams, and other developers to deliver high-quality data solutions.
- Create and maintain comprehensive technical documentation for Ab Initio graphs, data lineage, data models, and ETL processes.
- Participate in code reviews and design discussions, and contribute to team best practices.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 5+ years of hands-on, in-depth development experience with Ab Initio GDE, Co>Operating System, and EME.
- Expert-level proficiency in SQL for complex data manipulation, analysis, and optimization across various relational databases.
- Solid understanding of relational data modeling concepts and experience designing logical and physical data models.
- Demonstrated proficiency or strong foundational knowledge in Apache Spark (Spark SQL, DataFrames) and familiarity with the broader Hadoop ecosystem (HDFS, Hive).
- Experience with Unix/Linux shell scripting.
- Strong understanding of ETL processes, data warehousing concepts, and data integration patterns.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
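The star-schema design this role calls for, a central fact table of measures keyed to small descriptive dimension tables, can be sketched quickly (sqlite3 as a generic stand-in; the sales schema and all names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Dimensions carry descriptive attributes keyed by a compact key...
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
-- ...while the fact table holds foreign keys plus additive measures.
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER,
    revenue REAL
);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gizmo', 'Hardware');
INSERT INTO fact_sales VALUES
    (20240101, 1, 3, 30.0), (20240101, 2, 1, 25.0), (20240201, 1, 2, 20.0);
""")

# The canonical analytic query shape: join out to dimensions, aggregate the fact.
result = cur.execute("""
    SELECT d.month, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(result)
```

A snowflake schema would further normalize the dimensions (e.g., splitting `category` into its own table); 3NF would normalize everything, trading simpler updates for more joins at query time.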

Posted 6 days ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site


Overview: A Data Modeller is responsible for designing, implementing, and managing data models that support the strategic and operational needs of an organization. This role involves translating business requirements into data structures, ensuring consistency, accuracy, and efficiency in data storage and retrieval processes.

Responsibilities:
- Develop and maintain conceptual, logical, and physical data models.
- Collaborate with business analysts, data architects, and stakeholders to gather data requirements.
- Translate business needs into efficient database designs.
- Optimize and refine existing data models to support analytics and reporting.
- Ensure data models support data governance, quality, and security standards.
- Work closely with database developers and administrators on implementation.
- Document data models, metadata, and data flows.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- Data modeling tools: ER/Studio, ERwin, SQL Developer Data Modeler, or similar.
- Database technologies: proficiency in SQL and familiarity with databases like Oracle, SQL Server, MySQL, PostgreSQL.
- Data warehousing: experience with dimensional modeling, star and snowflake schemas.
- ETL processes: knowledge of Extract, Transform, Load processes and tools.
- Cloud platforms: familiarity with cloud data services (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
- Metadata management and data governance: understanding of data cataloging and governance principles.
- Strong analytical and problem-solving skills.
- Excellent communication skills for working with business stakeholders and technical teams.
- Ability to document models clearly and explain complex data relationships.
- 5+ years in data modeling, data architecture, or related roles.
- Experience working in Agile or DevOps environments is often preferred.
- Understanding of normalization/denormalization.
- Experience with business intelligence and reporting tools.
- Familiarity with master data management (MDM) principles.

Posted 1 week ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description: Data Modeler
Primary Skills: Data Modelling, Database Design, Erwin, Dimensional Modelling
Secondary Skills: Data Lake and Lakehouse design, Data Warehouse design
Location: Hyderabad
Industry: Insurance
Employment Type: Permanent
Functional Area: Solutions & Delivery
Experience: 6-8 Years

Are you a seasoned Data Modeller with expertise in Data & Analytics projects? Do you stay ahead of the curve with the latest technologies, eager to expand your knowledge? Do you thrive in dynamic, fast-paced environments and have a passion for delivering high-quality solutions? If so, we have an exciting opportunity for you! As a Data Modeler at ValueMomentum, you will be responsible for designing and implementing scalable, high-performance Modern Data & Analytics solutions in an agile environment. You will work closely with cross-functional teams to create reusable, testable, and sustainable data architectures that align with business needs. This role will directly impact the quality of data systems and analytics solutions, helping organizations unlock the full potential of their data.

Why ValueMomentum? Headquartered in New Jersey, US, ValueMomentum is one of the fastest-growing software & solutions firms focused on the Healthcare, Insurance, and Financial Services domains. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver value and drive momentum for our customers' initiatives. At ValueMomentum, we value continuous learning, innovation, and collaboration. You will have the opportunity to work with cutting-edge technologies and make a significant impact on our data-driven solutions. You will collaborate with a talented team of professionals, shape the future of data architecture, and contribute to the success of our clients in industries that are transforming rapidly. If you're ready to take your career to the next level, apply today to join our dynamic team and help us drive innovation in the world of Modern Data & Analytics!

Key Responsibilities:
- Design and develop conceptual, logical, and physical data models that meet business requirements and strategic initiatives.
- Design, develop, and maintain insurance data models.
- Collaborate with business stakeholders and data engineers to understand data needs and translate them into effective data models.
- Analyze and evaluate existing data models and databases to identify opportunities for optimization, standardization, and improvement.
- Define data standards, naming conventions, and data governance policies to ensure consistency, integrity, and quality of data models.
- Develop and maintain documentation of data models, data dictionaries, STTM, and metadata to facilitate understanding and usage of data assets.
- Implement best practices for data modelling, including normalization, denormalization, indexing, partitioning, and optimization techniques.
- Work closely with database administrators to ensure proper implementation and maintenance of data models in database management systems.
- Stay abreast of industry trends, emerging technologies, and best practices in data modelling, database design, and data management.

Must-have Skills:
- 6-8 years of hands-on experience in data modelling, including database design and dimensional modelling (star/snowflake schemas).
- Hands-on experience implementing data models for Policy, Claims, and Finance subject areas within the Property & Casualty (P&C) Insurance domain.
- Proficiency in data modelling tools such as Erwin, ER/Studio, or PowerDesigner.
- Strong SQL skills and experience working with relational databases, primarily SQL Server.
- Exposure to design principles and best practices for Data Lake and Lakehouse architectures.
- Experience with big data platforms (e.g., Spark).
- Strong documentation skills, including data dictionaries, STTM, ER models, etc.
- Familiarity with data warehouse design principles, ETL processes, and data integration techniques.
- Knowledge of cloud-based data platforms and infrastructure.

Nice-to-have Skills:
- Expertise in advanced data modelling techniques for real-time/near real-time data solutions.
- Experience with NoSQL data modelling.
- Hands-on experience with any BI tool.

Your Key Accountabilities:
- Collaborate with cross-functional teams to align data models with business and technical requirements.
- Define and enforce best practices for data modelling and database design.
- Provide technical guidance on database optimization and performance tuning.
- Draft technical guidelines, documentation, and data dictionaries to standardize data modelling practices.

What We Offer:
- Career advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development, plus comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback, and year-end appraisal, with reward and recognition for extraordinary performers.
- Benefits: comprehensive health benefits, wellness and fitness programs, paid time off, and holidays.
- Culture: a highly transparent organization with an open-door policy and a vibrant culture.

If you're enthusiastic about Data & Analytics and eager to make an impact through your expertise, we invite you to join us. Apply now and become part of a team that's driving the future of data-driven decision-making!

Posted 1 week ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site


Job Description: We are looking for a seasoned Data Engineer with 5-8 years of experience, specializing in Microsoft Fabric, for our UK-based client. The ideal candidate will play a key role in designing, building, and optimizing scalable data pipelines and models. You will work closely with analytics and business teams to drive data integration, ensure quality, and support data-driven decision-making in a modern cloud environment.

Key Responsibilities:
- Design, develop, and optimize end-to-end data pipelines using Microsoft Fabric (Data Factory, Dataflows Gen2).
- Create and maintain data models, semantic models, and data marts for analytical and reporting purposes.
- Develop and manage SQL-based ETL processes, integrating various structured and unstructured data sources.
- Collaborate with BI developers and analysts to develop Power BI datasets, dashboards, and reports.
- Implement robust data integration solutions across diverse platforms and sources (on-premises, cloud).
- Ensure data integrity, quality, and governance through automated validation and error-handling mechanisms.
- Work with business stakeholders to understand data requirements and translate them into technical specifications.
- Optimize data workflows for performance and cost-efficiency in a cloud-first architecture.
- Provide mentorship and technical guidance to junior data engineers.

Required Skills:
- Strong hands-on experience with Microsoft Fabric, including Dataflows Gen2, Pipelines, and OneLake.
- Proficiency in Power BI, including building reports and dashboards and working with semantic models.
- Solid understanding of data modeling techniques: star schema, snowflake schema, normalization/denormalization.
- Deep experience with SQL, stored procedures, and query optimization.
- Experience in data integration from diverse sources such as APIs, flat files, databases, and streaming data.
- Knowledge of data governance, lineage, and data catalog capabilities within the Microsoft ecosystem.
- Strong problem-solving skills and experience in performance tuning of large datasets.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Key Responsibilities:
- Develop, optimize, and maintain complex SQL queries, stored procedures, functions, and views.
- Analyze slow-performing queries and optimize execution plans to improve database performance.
- Design and implement indexing strategies to enhance query efficiency.
- Work with developers to optimize database interactions in applications.
- Develop and implement Teradata best practices for large-scale data processing and ETL workflows.
- Monitor and troubleshoot Teradata performance issues using tools such as DBQL (Database Query Log), Viewpoint, and Explain Plan analysis.
- Perform data modeling, normalization, and schema-design improvements.
- Collaborate with teams to implement best practices for database tuning and performance enhancement.
- Automate repetitive database tasks using scripts and scheduled jobs.
- Document database architecture, queries, and optimization techniques.

Required Skills & Qualifications:
- Strong proficiency in Teradata SQL, including query optimization techniques.
- Strong proficiency in SQL (T-SQL, PL/SQL, or equivalent).
- Experience with indexing strategies, partitioning, and caching techniques.
- Knowledge of database normalization, denormalization, and best practices.
- Familiarity with ETL processes, data warehousing, and large datasets.
- Experience in writing and optimizing stored procedures, triggers, and functions.
- Hands-on experience in Teradata performance tuning, indexing, partitioning, and statistics collection.
- Experience with EXPLAIN plans, DBQL analysis, and Teradata Viewpoint monitoring.
- Power BI/Tableau integration experience is good to have.

About Us: Bristlecone is the leading provider of AI-powered application transformation services for the connected supply chain. We empower our customers with speed, visibility, automation, and resiliency, to thrive on change. Our transformative solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement, and Digitalization are positioned around key industry pillars and delivered through a comprehensive portfolio of services spanning digital strategy, design and build, and implementation across a range of technology platforms. Bristlecone is ranked among the top ten leaders in supply chain services by Gartner. We are headquartered in San Jose, California, with locations across North America, Europe, and Asia, and over 2,500 consultants. Bristlecone is part of the $19.4 billion Mahindra Group.

Equal Opportunity Employer: Bristlecone is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.

Information Security Responsibilities:
- Understand and adhere to information security policies, guidelines, and procedures, and practice them to protect organizational data and information systems.
- Take part in information security training and act accordingly when handling information.
- Report all suspected security and policy breaches to the InfoSec team or the appropriate authority (CISO).
- Understand and adhere to the additional information security responsibilities that are part of the assigned job role.
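The query-tuning duties this role describes (indexing strategies, execution-plan analysis) can be illustrated generically. Here sqlite3 and its `EXPLAIN QUERY PLAN` stand in for Teradata and its Explain facility, and the table and index names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dbql_log (query_id INTEGER, user_name TEXT, cpu_secs REAL)")
cur.executemany(
    "INSERT INTO dbql_log VALUES (?, ?, ?)",
    [(i, f"user{i % 50}", i * 0.1) for i in range(5000)],
)

def plan(cur, sql):
    """Return the engine's access-path summary lines for a query."""
    return [row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT cpu_secs FROM dbql_log WHERE user_name = 'user7'"
before = plan(cur, query)   # no index: the planner falls back to a table scan
cur.execute("CREATE INDEX idx_user ON dbql_log(user_name)")
after = plan(cur, query)    # same query now resolves via the new index

print(before, after)
```

Reading the plan before and after a change is the habit that transfers directly: the same workflow applies whether the tool is `EXPLAIN QUERY PLAN`, Teradata's Explain, or any other engine's plan output.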

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Title: SAP Analytics Cloud Specialist
Career Level: D2

Introduction to role
Are you ready to make a significant impact in the world of analytics? Join AstraZeneca's Process Insights team within Global Business Services (GBS) as an SAP Analytics Cloud Specialist. We are on a mission to transform business processes through automation, analytics, and AI capabilities. As we scale our capabilities, you'll play a pivotal role in delivering SAP analytics solutions that drive progress across AstraZeneca.

Accountabilities
- Collaborate with stakeholders to understand their business process requirements and objectives, translating them into SAP Analytics solutions (SAC & Datasphere).
- Create Extract, Transform, and Load (ETL) data pipelines, data warehousing, and testing.
- Validate and assure data quality and accuracy, including data cleansing, enrichment, and building data models.
- Develop comprehensive analytics and dashboards for business stakeholders for reporting, business planning, and key metric tracking.
- Enhance solution experiences and visualizations using low/no-code development.

Essential Skills/Experience
- Degree in Computer Science, Business Informatics, or a comparable field.
- 8-10 years of overall experience, with at least 2 years working on SAP SAC/Datasphere solutions as a Data Analyst and/or Data Engineer.
- Experience in SAP Datasphere: ETL, building data pipelines, preparing and integrating data, and data modelling, including relational data modelling and denormalization techniques.
- Experience creating advanced analytics and dashboards in SAP Analytics Cloud (e.g., stories, boardrooms, planning).
- Knowledge of analytics best practices.
- Understanding of SAP-related Finance and/or Operations processes is valued.
- Certification in one or more of the following is appreciated: SAC Data Analyst, Data Engineer, Low-Code/No-Code Developer.
- Good communication skills and the ability to work in an Agile environment.
- Energetic, organized, and self-motivated; fluent in business English.

Desirable Skills/Experience
NA

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

AstraZeneca is a dynamic company where innovation is at the forefront of everything we do. Here, you can apply your skills to genuinely impact patients' lives while being part of a global team that drives excellence and breakthroughs. With a focus on digital transformation and leveraging radical technologies, we offer an environment where you can challenge norms, take ownership, and make quick decisions. Our commitment to sustainability and empowering our teams ensures that every action contributes to a greater purpose.

Ready to take the next step in your career? Apply now and be part of our journey towards transforming healthcare through analytics!
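ETL pipeline work of the kind this posting describes can be sketched generically. The sketch below is plain Python with made-up data and function names, not SAP Datasphere APIs: it shows the three stages (extract from a source, transform by cleansing and typing the rows, load into a target model).

```python
# Minimal ETL sketch (illustrative only; not specific to SAP Datasphere or any vendor).

def extract():
    # In practice this would read from a source system (e.g., database tables, CSV, API).
    return [
        {"region": "EMEA", "revenue": "1200.50"},
        {"region": "emea", "revenue": "300"},      # inconsistent region code
        {"region": "APAC", "revenue": None},       # dirty row: missing measure
    ]

def transform(rows):
    # Cleanse: drop rows with missing revenue, normalize region codes, cast types.
    return [
        {"region": r["region"].upper(), "revenue": float(r["revenue"])}
        for r in rows
        if r["revenue"] is not None
    ]

def load(rows, target):
    # Aggregate into the target "model" (here, a dict keyed by region).
    for r in rows:
        target[r["region"]] = target.get(r["region"], 0.0) + r["revenue"]
    return target

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'EMEA': 1500.5}
```

Real pipelines add incremental loads, error handling, and scheduling, but the extract/transform/load separation shown here is the core pattern.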

Posted 2 weeks ago

Apply

8.0 - 10.0 years

5 - 8 Lacs

Chennai

On-site

Job ID: R-226449
Date posted: 05/23/2025
Contract type: Full time
Job Title: SAP Analytics Cloud Specialist
Career Level: D2

Introduction to role
Are you ready to make a significant impact in the world of analytics? Join AstraZeneca's Process Insights team within Global Business Services (GBS) as an SAP Analytics Cloud Specialist. We are on a mission to transform business processes through automation, analytics, and AI capabilities. As we scale our capabilities, you'll play a pivotal role in delivering SAP analytics solutions that drive progress across AstraZeneca.

Accountabilities
- Collaborate with stakeholders to understand their business process requirements and objectives, translating them into SAP Analytics solutions (SAC & Datasphere).
- Create Extract, Transform, and Load (ETL) data pipelines, data warehousing, and testing.
- Validate and assure data quality and accuracy, including data cleansing, enrichment, and building data models.
- Develop comprehensive analytics and dashboards for business stakeholders for reporting, business planning, and key metric tracking.
- Enhance solution experiences and visualizations using low/no-code development.

Essential Skills/Experience
- Degree in Computer Science, Business Informatics, or a comparable field.
- 8-10 years of overall experience, with at least 2 years working on SAP SAC/Datasphere solutions as a Data Analyst and/or Data Engineer.
- Experience in SAP Datasphere: ETL, building data pipelines, preparing and integrating data, and data modelling, including relational data modelling and denormalization techniques.
- Experience creating advanced analytics and dashboards in SAP Analytics Cloud (e.g., stories, boardrooms, planning).
- Knowledge of analytics best practices.
- Understanding of SAP-related Finance and/or Operations processes is valued.
- Certification in one or more of the following is appreciated: SAC Data Analyst, Data Engineer, Low-Code/No-Code Developer.
- Good communication skills and the ability to work in an Agile environment.
- Energetic, organized, and self-motivated; fluent in business English.

Desirable Skills/Experience
NA

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

AstraZeneca is a dynamic company where innovation is at the forefront of everything we do. Here, you can apply your skills to genuinely impact patients' lives while being part of a global team that drives excellence and breakthroughs. With a focus on digital transformation and leveraging radical technologies, we offer an environment where you can challenge norms, take ownership, and make quick decisions. Our commitment to sustainability and empowering our teams ensures that every action contributes to a greater purpose.

Ready to take the next step in your career? Apply now and be part of our journey towards transforming healthcare through analytics!

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

OP is looking for a seasoned Solutions Architect to join our dynamic Architecture team and help shape the future of digital transformation. You’ll work alongside visionary technology leaders to craft cutting-edge solutions that address complex business challenges. Ideal candidates will have strong experience in digital, mobile, and cloud technologies and the ability to lead high-impact initiatives from ideation through execution. This is an opportunity to make a tangible difference by delivering next-generation applications that drive business success.

Responsibilities
- Oversee the solution architecture and design for key projects, delivering accurate estimates and coordinating with architects and designers across solution, infrastructure, and data disciplines to effectively address business challenges.
- Collaborate with delivery teams, production support, and Shared Services partners (such as Quality Assurance, Infrastructure Engineering, and Reference Architecture) to ensure alignment of solution strategies and estimates.
- Work closely with business stakeholders to identify problems, create new business capabilities, and design technology solutions that drive success, ensuring all solutions align with business requirements while emphasizing performance, scalability, security, and cost-efficiency.
- Provide architectural guidance by collaborating with portfolio teams, IT departments, and external partners; present strategies, incorporate feedback, and foster collaboration with cross-functional technical teams.
- Embed within Scrum teams by engaging in daily standups and ceremonies, providing architectural direction, and guiding and mentoring technical teams through complex architectural challenges while ensuring alignment with best practices and project goals.
- Continuously assess and recommend specific tools, platforms, and frameworks that meet evolving project needs, ensuring high compatibility and efficiency.
- Promote and implement modular design principles to facilitate independent component development and testing, while consistently applying best security practices such as least privilege and data protection across all systems.
- Conduct quality and security assurance, developing metrics to drive and maintain code quality standards and ensuring adherence to automated code review processes.
- Evaluate design options by creating high-level cost estimates for various architectural approaches, ensuring solutions are scalable, secure, and high-performing with a focus on cost-efficiency.
- Design and review high availability and disaster recovery architectures, proactively identifying areas for improvement and remediating issues to meet project and enterprise standards.

Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 10+ years of IT experience, with at least 3 years as a developer and 3 years as a solution architect.
- AWS Certified Solutions Architect certification strongly preferred.
- TOGAF certification is a plus.

Experience

Application
- Programming languages such as Java, Python, .NET, or similar; Java preferred.
- JavaScript frameworks such as Angular or React for building user interfaces.
- RESTful APIs, GraphQL, and SOAP for interaction between applications and services.
- Event streaming and messaging brokers such as Apache Kafka, AWS Kinesis, AWS SNS/SQS, or ActiveMQ.
- Batch processing (e.g., ETL and Spring Batch).
- Microservices architecture.
- Serverless, including AWS Lambda.
- Containerization, including Docker and Kubernetes.
- Design patterns such as MVC (Model-View-Controller), Strangler, and SOA (Service-Oriented Architecture).
- API gateway and management tools such as Apigee and Amazon API Gateway.
- Domain-Driven Design.
- Integration platforms such as Spring Integration for connecting diverse systems.
- Mobile app development frameworks (e.g., Ionic, Capacitor, React Native, Flutter, or Swift).
- Workflow and process engines such as AWS Step Functions, Camunda, Flowable, and Pega.
- Document management systems such as Hyland Alfresco.
- Content management platforms such as Adobe AEM, plus general eCommerce experience.
- Testing tools such as Selenium, JUnit, or TestNG for creating automated unit, integration, and performance tests.

Cloud and DevOps
- Architecture and detailed design of solutions on cloud platforms such as AWS, Microsoft Azure, or Google Cloud.
- DevOps, including CI/CD pipelines (e.g., GitHub Actions).
- Infrastructure as Code (e.g., Terraform and OpenTofu).

Data
- SQL databases such as Oracle, PostgreSQL, or Microsoft SQL Server for structured data storage.
- NoSQL databases such as DynamoDB for handling unstructured or semi-structured data.
- Normalizing data models and understanding the trade-offs of denormalization in large-scale systems.
- Enterprise data architecture, including operational data stores, data replication, data lakes, and data warehousing.

Cyber and Privacy
- Security frameworks such as ISO 27001 and NIST, and GDPR compliance.
- Compliance standards such as CCPA and GDPR, particularly important when dealing with sensitive business data.
- Secure coding practices and principles (e.g., OWASP), encryption techniques, and identity management.
- Authentication protocols (OAuth, JWT) and identity management solutions (e.g., Azure AD, ForgeRock, SailPoint).

Benefits
Health Insurance, Accident Insurance. The salary will be determined based on several factors, including, but not limited to, location, relevant education, qualifications, experience, technical skills, and business needs.

Additional Responsibilities
- Participate in OP monthly team meetings and team-building efforts.
- Contribute to OP technical discussions, peer reviews, etc.
- Contribute content and collaborate via the OP-Wiki/Knowledge Base.
- Provide status reports to OP Account Management as requested.

About Us
OP is a technology consulting and solutions company, offering advisory and managed services, innovative platforms, and staffing solutions across a wide range of fields, including AI, cyber security, enterprise architecture, and beyond. Our most valuable asset is our people: dynamic, creative thinkers who are passionate about doing quality work. As a member of the OP team, you will have access to industry-leading consulting practices, strategies, and technologies, plus innovative training and education. An ideal OP team member is a technology leader with a proven track record of technical excellence and a strong focus on process and methodology.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

About the role:
We are looking for an experienced Data Modeler who will be responsible for designing and implementing data models that support the organization's master data, analytics, and other data management needs. You will work closely with stakeholders to ensure data models align with business requirements and optimize performance and scalability. Your role will involve maintaining data integrity and security, establishing data modeling standards, and staying current with industry trends in data modeling.

What you will do:
- Design, Develop and Implement Data Models: Create conceptual, logical, and physical data models tailored to organizational requirements.
- Stakeholder Collaboration: Partner with business analysts, data architects, database administrators, and other IT professionals to comprehend data needs and ensure alignment with business objectives.
- Data Integrity and Security: Establish and maintain data integrity, stability, and security across various databases.
- Database Performance Optimization: Enhance and fine-tune data models for optimal performance, ensuring efficient data storage, retrieval, and analysis.
- Data Modeling Standards: Define and enforce organizational standards and best practices for data modeling.
- Data Model Maintenance: Oversee the maintenance of data models, data lineage, and metadata management.
- Data Issue Resolution: Identify, track, and resolve any data-related issues.
- Data Model Documentation and Communication: Document and convey data models and their specifications to stakeholders effectively.
- Use of Data Modeling Tools: Employ data modeling tools and software for the creation and management of data models.

What you will need:
A strong data professional with at least 3 years of experience in a hands-on data modeling role and over 7 years of overall experience in a data-driven profession.

Must have:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3+ years of demonstrated experience as a Data Modeler.
- Experience maintaining a large, enterprise-grade data model.
- Proficiency in data modeling techniques and methodologies.
- Strong understanding of data normalization, denormalization, dimensional modeling, warehousing concepts, and schema design.
- Hands-on experience with data modeling tools such as ERwin.
- Strong knowledge of SQL and database management systems like PostgreSQL, MySQL, Oracle, or SQL Server.
- Experience in data profiling, data sampling, and data conversion exercises.
- Experience in Agile or Scrum environments.
- Strong communication and collaboration skills with both technical and non-technical stakeholders.

Nice to have:
- Experience working with MPP databases like Synapse, Snowflake, etc.
- Good understanding of business processes around master data management in a B2B setup.
- Background in data governance and data quality concepts.
- Experience with Databricks and Power BI.

Who you are:
- A curious learner who can maturely handle critical enterprise projects with multiple stakeholders.
- Able to work independently or within a team, proactively, in a fast-paced Agile environment.
- Owns success: takes responsibility for the successful delivery of solutions.
- Has a strong desire to improve their skills in tools and technologies.

Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference.
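As one illustration of the dimensional modeling and schema design skills this posting lists, a minimal star schema might look like the sketch below. The tables and data are hypothetical, and in-memory SQLite is used purely for demonstration: dimension tables hold descriptive attributes, while the fact table holds measures keyed to the dimensions.

```python
# Minimal star-schema sketch (hypothetical tables; illustrative of dimensional modeling).
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
-- Dimension tables: descriptive attributes, one row per member.
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);

-- Fact table: measures at the grain of (product, date), with keys into the dimensions.
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units       INTEGER,
    revenue     REAL
);

INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO dim_date VALUES (20250101, 2025, 1), (20250201, 2025, 2);
INSERT INTO fact_sales VALUES
    (1, 20250101, 10, 100.0), (2, 20250101, 5, 250.0), (1, 20250201, 4, 40.0);
""")

# Typical analytic query: slice the measures by dimension attributes.
rows = cur.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 'Hardware', 350.0), (2, 'Hardware', 40.0)]
```

The same shape scales to warehouse platforms (Snowflake, Synapse, etc.); only the surrounding tooling changes, not the modeling idea.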
The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities.
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 99834

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy

For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

About the role:
We are looking for an experienced Data Modeler who will be responsible for designing and implementing data models that support the organization's master data, analytics, and other data management needs. You will work closely with stakeholders to ensure data models align with business requirements and optimize performance and scalability. Your role will involve maintaining data integrity and security, establishing data modeling standards, and staying current with industry trends in data modeling.

What you will do:
- Design, Develop and Implement Data Models: Create conceptual, logical, and physical data models tailored to organizational requirements.
- Stakeholder Collaboration: Partner with business analysts, data architects, database administrators, and other IT professionals to comprehend data needs and ensure alignment with business objectives.
- Data Integrity and Security: Establish and maintain data integrity, stability, and security across various databases.
- Database Performance Optimization: Enhance and fine-tune data models for optimal performance, ensuring efficient data storage, retrieval, and analysis.
- Data Modeling Standards: Define and enforce organizational standards and best practices for data modeling.
- Data Model Maintenance: Oversee the maintenance of data models, data lineage, and metadata management.
- Data Issue Resolution: Identify, track, and resolve any data-related issues.
- Data Model Documentation and Communication: Document and convey data models and their specifications to stakeholders effectively.
- Use of Data Modeling Tools: Employ data modeling tools and software for the creation and management of data models.

What you will need:
A strong data professional with at least 3 years of experience in a hands-on data modeling role and over 7 years of overall experience in a data-driven profession.

Must have:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3+ years of demonstrated experience as a Data Modeler.
- Experience maintaining a large, enterprise-grade data model.
- Proficiency in data modeling techniques and methodologies.
- Strong understanding of data normalization, denormalization, dimensional modeling, warehousing concepts, and schema design.
- Hands-on experience with data modeling tools such as ERwin.
- Strong knowledge of SQL and database management systems like PostgreSQL, MySQL, Oracle, or SQL Server.
- Experience in data profiling, data sampling, and data conversion exercises.
- Experience in Agile or Scrum environments.
- Strong communication and collaboration skills with both technical and non-technical stakeholders.

Nice to have:
- Experience working with MPP databases like Synapse, Snowflake, etc.
- Good understanding of business processes around master data management in a B2B setup.
- Background in data governance and data quality concepts.
- Experience with Databricks and Power BI.

Who you are:
- A curious learner who can maturely handle critical enterprise projects with multiple stakeholders.
- Able to work independently or within a team, proactively, in a fast-paced Agile environment.
- Owns success: takes responsibility for the successful delivery of solutions.
- Has a strong desire to improve their skills in tools and technologies.

Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference.
The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities.
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 99834. By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site


Design and develop conceptual, logical, and physical data models for enterprise and application-level databases. Translate business requirements into well-structured data models that support analytics, reporting, and operational systems. Define and maintain data standards, naming conventions, and metadata for consistency across systems. Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes. Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements. Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships. Support data governance initiatives including data lineage, quality, and cataloging. Review and validate data models with business and technical stakeholders. Provide guidance on normalization, denormalization, and performance tuning of database designs. Ensure models comply with organizational data policies, security, and regulatory requirements.
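
The normalization/denormalization guidance this role covers can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration only (SQLite stands in for a production database, and all table and column names are invented): a normalized pair of tables joined at query time, versus a denormalized reporting table that trades redundancy for join-free reads.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized (3NF-style) design: each fact is stored once and joined at query time.
cur.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customer, amount REAL);
""")
cur.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                [(1, "Acme", "Pune"), (2, "Globex", "Chennai")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# Denormalized reporting table: city and name are copied onto every order row,
# accepting redundancy (and update anomalies) in exchange for join-free reads.
cur.execute("""
CREATE TABLE orders_denorm AS
SELECT o.order_id, o.amount, c.name AS customer_name, c.city
FROM orders o JOIN customer c USING (customer_id)
""")

rows = cur.execute(
    "SELECT city, SUM(amount) FROM orders_denorm GROUP BY city ORDER BY city"
).fetchall()
print(rows)  # [('Chennai', 75.0), ('Pune', 350.0)]
```

The rollup query reads a single table, which is the usual motivation for denormalizing reporting layers; the cost is that a change to a customer's city must be propagated to every copied row.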

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About The Role We are looking for an experienced Data Modeler, who will be responsible for designing and implementing data models that support the organization's master data, analytics & other data management needs. You will work closely with stakeholders to ensure data models align with business requirements and optimize performance & scalability. Your role will involve maintaining data integrity and security, establishing data modeling standards, and staying updated with industry trends around data modeling. What You Will Do Design, Develop and Implement Data Models: Create conceptual, logical, and physical data models tailored to organizational requirements. Stakeholder Collaboration: Partner with business analysts, data architects, database administrators, and other IT professionals to comprehend data needs and ensure alignment with business objectives. Data Integrity and Security: Establish and maintain data integrity, stability, and security across various databases. Database Performance Optimization: Enhance and fine-tune data models for optimal performance, ensuring efficient data storage, retrieval, and analysis. Data Modeling Standards: Define and enforce organizational standards and best practices for data modeling. Data Model Maintenance: Oversee the maintenance of data models, data lineage, and metadata management. Data Issue Resolution: Identify, track, and resolve any data-related issues. Data Model Documentation and Communication: Document and convey data models and their specifications to stakeholders effectively. Use of Data Modeling Tools: Employ data modeling tools and software for the creation and management of data models. What You Will Need Strong data professional with at least 3 years of experience in a hands-on data modeling role and over 7 years of overall experience in a data-driven profession. Must Have Bachelor’s degree in Computer Science, Information Technology, or a related field. 3+ years of demonstrated experience as a Data Modeler. 
Must have maintained a large enterprise-grade data model. Proficiency in data modeling techniques and methodologies. Strong understanding around data normalization, denormalization, dimensional modeling, warehousing concepts & schema design. Hands-on experience with data modeling tools such as ERwin. Strong knowledge of SQL and database management systems like PostgreSQL, MySQL, Oracle or SQL Server. Experienced in data profiling, data sampling, and data conversion exercises. Experience in Agile or Scrum environments. Strong communication and collaboration skills with both technical and non-technical stakeholders. Nice To Have Experience in working with MPP databases like Synapse, Snowflake, etc. Good understanding of business processes around master data management in a B2B setup. Background in data governance and data quality concepts. Experience with Databricks & Power BI. Who You Are Curious learner who can maturely handle critical enterprise projects with multiple stakeholders. Able to work independently or within a team proactively in a fast-paced AGILE environment. Owns success – Takes responsibility for the successful delivery of the solutions. Strong desire to improve upon their skills in tools and technologies. Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. 
So is the impact you can have here. What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. 
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 99834. By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site


Job Title: Senior Data Engineer – Microsoft Fabric
Experience: 5–8 Years
Location: Kochi
Notice Period: Immediate
Job Description: We are seeking an experienced Senior Data Engineer with a strong focus on Microsoft Fabric. This role is ideal for professionals with a deep understanding of modern data engineering principles and a track record of building scalable, high-performance data solutions in a cloud-native environment.
Key Responsibilities:
Design, develop, and optimize scalable data pipelines using Microsoft Fabric tools such as Data Factory and Dataflows Gen2
Create and maintain data models, semantic models, and data marts for analytics and reporting
Develop SQL-based ETL processes to integrate data from structured and unstructured sources
Collaborate with BI developers and analysts to create Power BI datasets, reports, and dashboards
Implement robust data integration from diverse platforms and sources (on-premise and cloud)
Ensure data quality, governance, and integrity with automated validation and error handling
Translate business data needs into technical specifications
Optimize workflows for performance and cost in cloud-first environments
Provide mentorship to junior data engineers
Required Skills:
Hands-on experience with Microsoft Fabric (Dataflows Gen2, Pipelines, OneLake)
Strong proficiency in Power BI, semantic modeling, and report/dashboard development
Expertise in data modeling (star schema, snowflake, normalization/denormalization)
Advanced SQL skills including stored procedures and performance tuning
Experience integrating data from APIs, flat files, databases, and streaming sources
Knowledge of data governance, data lineage, and cataloging in the Microsoft ecosystem
Strong analytical and problem-solving skills with the ability to handle large datasets
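
The "automated validation and error handling" responsibility above is often implemented as schema checks at the pipeline boundary. A minimal, hypothetical sketch in plain Python follows; the column spec, record shapes, and function name are invented for illustration and are not a Fabric API.

```python
# Hypothetical column specification for a pipeline's target table.
SCHEMA = {
    "order_id": int,
    "amount": float,
    "currency": str,
}

def validate(row: dict) -> list:
    """Return a list of violations for one incoming record (empty list = clean)."""
    errors = []
    for col, typ in SCHEMA.items():
        if col not in row:
            errors.append(f"missing column: {col}")
        elif not isinstance(row[col], typ):
            errors.append(f"{col}: expected {typ.__name__}, got {type(row[col]).__name__}")
    return errors

good = {"order_id": 1, "amount": 9.5, "currency": "INR"}
bad = {"order_id": "1", "amount": 9.5}

print(validate(good))  # []
print(validate(bad))   # ['order_id: expected int, got str', 'missing column: currency']
```

In a real pipeline, rows failing such checks would typically be routed to a quarantine table or dead-letter store rather than silently dropped, so that error handling stays auditable.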

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


TCS HIRING!! ROLE: AWS Data Architect YEARS OF EXP: 8+ YEARS LOCATION: Chennai / Pune / Bangalore / Hyderabad Data Architect: Must have: Relational SQL/ Caching expertise – Deep knowledge of Amazon Aurora PostgreSQL, ElastiCache, etc. Data modeling – Experience in OLTP and OLAP schemas, normalization, denormalization, indexing, and partitioning. Schema design & migration – Defining best practices for schema evolution when migrating from SQL Server to PostgreSQL. Data governance – Designing data lifecycle policies, archival strategies, and regulatory compliance frameworks. AWS Glue & AWS DMS – Leading data migration strategies to Aurora PostgreSQL. ETL & Data Pipelines – Expertise in Extract, Transform, Load (ETL) workflows, Glue job features, and event-driven architectures. Data transformation & mapping – PostgreSQL PL/pgSQL migration/transformation expertise while ensuring data integrity. Cross-platform data integration – Connecting cloud and on-premises/other cloud data sources. AWS Data Services – Strong experience in S3, Glue, Lambda, Redshift, Athena, and Kinesis. Infrastructure as Code (IaC) – Using Terraform, CloudFormation, or AWS CDK for database provisioning. Security & Compliance – Implementing IAM, encryption (AWS KMS), access control policies, and compliance frameworks (e.g., GDPR, PII). Query tuning & indexing strategies – Optimizing queries for high performance. Capacity planning & scaling – Ensuring high availability, failover mechanisms, and auto-scaling strategies. Data partitioning & storage optimization – Designing cost-efficient hot/cold data storage policies. Should have experience with setting up the AWS architecture as per the project requirements. Good to have: Data Warehousing – Expertise in Amazon Redshift, Snowflake, or BigQuery. Big Data Processing – Familiarity with Apache Spark, EMR, Hadoop, or Kinesis. Data Lakes & Analytics – Experience in AWS Lake Formation, Glue Catalog, and Athena. 
Machine Learning Pipelines – Understanding of SageMaker, Bedrock, etc., for AI-driven analytics. CI/CD for Data Pipelines – Knowledge of AWS CodePipeline, Jenkins, or GitHub Actions. Serverless Data Architectures – Experience with event-driven systems (SNS, SQS, Step Functions).
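
The "query tuning & indexing strategies" item above can be demonstrated end to end without an AWS account. The sketch below uses SQLite's EXPLAIN QUERY PLAN as a lightweight stand-in for PostgreSQL's EXPLAIN; the table and index names are hypothetical, and the exact plan wording varies by SQLite version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, tenant_id INTEGER, payload TEXT)")
cur.executemany("INSERT INTO events (tenant_id, payload) VALUES (?, ?)",
                [(i % 50, "x") for i in range(5000)])

def plan(sql):
    # EXPLAIN QUERY PLAN returns rows whose last column describes the access path.
    return " ".join(row[-1] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE tenant_id = 7"
before = plan(query)  # no index on tenant_id yet: a full table scan
cur.execute("CREATE INDEX idx_events_tenant ON events (tenant_id)")
after = plan(query)   # now satisfied via the index

print(before)  # typically something like "SCAN events"
print(after)   # typically "SEARCH events USING COVERING INDEX idx_events_tenant (tenant_id=?)"
```

The same workflow (run EXPLAIN, add or adjust an index, re-run EXPLAIN) is how tuning is usually approached on Aurora PostgreSQL, with the added dimensions of partitioning and statistics.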

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


TCS HIRING!! ROLE: AWS Data Architect LOCATION: HYDERABAD YEARS OF EXP: 8+ YEARS Data Architect: Must have: Relational SQL/ Caching expertise – Deep knowledge of Amazon Aurora PostgreSQL, ElastiCache, etc. Data modeling – Experience in OLTP and OLAP schemas, normalization, denormalization, indexing, and partitioning. Schema design & migration – Defining best practices for schema evolution when migrating from SQL Server to PostgreSQL. Data governance – Designing data lifecycle policies, archival strategies, and regulatory compliance frameworks. AWS Glue & AWS DMS – Leading data migration strategies to Aurora PostgreSQL. ETL & Data Pipelines – Expertise in Extract, Transform, Load (ETL) workflows, Glue job features, and event-driven architectures. Data transformation & mapping – PostgreSQL PL/pgSQL migration/transformation expertise while ensuring data integrity. Cross-platform data integration – Connecting cloud and on-premises/other cloud data sources. AWS Data Services – Strong experience in S3, Glue, Lambda, Redshift, Athena, and Kinesis. Infrastructure as Code (IaC) – Using Terraform, CloudFormation, or AWS CDK for database provisioning. Security & Compliance – Implementing IAM, encryption (AWS KMS), access control policies, and compliance frameworks (e.g., GDPR, PII). Query tuning & indexing strategies – Optimizing queries for high performance. Capacity planning & scaling – Ensuring high availability, failover mechanisms, and auto-scaling strategies. Data partitioning & storage optimization – Designing cost-efficient hot/cold data storage policies. Should have experience with setting up the AWS architecture as per the project requirements. Good to have: Data Warehousing – Expertise in Amazon Redshift, Snowflake, or BigQuery. Big Data Processing – Familiarity with Apache Spark, EMR, Hadoop, or Kinesis. Data Lakes & Analytics – Experience in AWS Lake Formation, Glue Catalog, and Athena. Machine Learning Pipelines – Understanding of SageMaker, Bedrock, etc., for AI-driven analytics. CI/CD for Data Pipelines – Knowledge of AWS CodePipeline, Jenkins, or GitHub Actions. Serverless Data Architectures – Experience with event-driven systems (SNS, SQS, Step Functions).

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank - Commercial Cards team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. You will be responsible for designing, developing, and maintaining data models that support the organization's data architecture and business intelligence initiatives. This role involves working closely with data architects, business analysts, and other stakeholders to ensure that data models meet business requirements and are aligned with industry best practices. Job Responsibilities Designs and develops conceptual, logical, and physical data models to support data integration, data warehousing, and business intelligence solutions. Collaborates with business analysts and stakeholders to gather and understand data requirements and translate them into data models. Ensures data models are optimized for performance, scalability, and maintainability. Works with data architects to ensure data models align with the overall data architecture and strategy. Develops and maintains data dictionaries, metadata repositories, and data lineage documentation. Conducts data model reviews and provides recommendations for improvements. Supports data governance initiatives by ensuring data models adhere to data quality and data management standards. Assists in the development and implementation of data modeling best practices and standards. Provides support and guidance to development teams during the implementation of data models. 
Stays up-to-date with industry trends and advancements in data modeling techniques and tools. Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 5+ years of applied experience. Proven experience as a Data Modeler or in a similar role for more than 5 years. Strong understanding of data modeling concepts, including normalization, denormalization, and dimensional modeling. Proficiency in data modeling tools such as ER/Studio, ERwin, or similar. Experience with relational databases (e.g., Oracle, SQL Server, MySQL) and data warehousing solutions. Knowledge of data integration and ETL processes. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Ability to work independently and as part of a team.
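
The progression from logical model to physical schema that this posting describes can be illustrated with a toy translation step. Everything below is hypothetical: the entity, its attributes, the type map, and the generated DDL are invented for illustration, not any particular tool's format.

```python
# A logical entity definition: attribute -> "logical type[, constraint]".
LOGICAL_MODEL = {
    "Customer": {
        "customer_id": "int, pk",
        "name": "string",
        "created_at": "datetime",
    },
}

# Mapping from logical types to one physical dialect's column types.
TYPE_MAP = {"int": "INTEGER", "string": "VARCHAR(255)", "datetime": "TIMESTAMP"}

def to_ddl(entity, attrs):
    """Render one logical entity as a physical CREATE TABLE statement."""
    cols = []
    for name, spec in attrs.items():
        parts = [p.strip() for p in spec.split(",")]
        col = f"{name} {TYPE_MAP[parts[0]]}"
        if "pk" in parts:
            col += " PRIMARY KEY"
        cols.append(col)
    return f"CREATE TABLE {entity.lower()} (\n  " + ",\n  ".join(cols) + "\n);"

ddl = to_ddl("Customer", LOGICAL_MODEL["Customer"])
print(ddl)
```

Tools like ERwin or ER/Studio automate exactly this kind of forward engineering, with dialect-aware type maps and constraint handling far beyond this sketch.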

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Entity: Finance Job Family Group: Business Support Group Job Description: We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat, and mobility to millions of people every day. We are one of the very few companies equipped to solve some of the big complex challenges that matter for the future. We have a real contribution to make to the world's ambition of a low-carbon future. Join us and be part of what we can accomplish together. You can participate in our new ambition to become a net zero company by 2050 or sooner and help the world get to net zero. Would you like to discover how our diverse, hardworking people are leading the way in making energy cleaner and better – and how you can play your part in our outstanding team? Join our Finance Team and advance your career as a Data Analyst. Key Accountabilities Perform investigative data analysis and exploratory data mining to uncover insights and trends. Craft and build scalable data pipelines using Azure Databricks, Spark, and Delta Lake. Collaborate with data engineers and architects to craft and implement efficient data pipelines in the Databricks platform. Build pyspark-based analytical data assets in finance to solve sophisticated business problems. Apply expertise in star and snowflake schema structures and data models to optimize data storage and retrieval. Perform performance tuning on SPARK SQL queries and pyspark dataframes, optimize data transformations, and improve query execution times for large data sources. Design, develop, and maintain data structures and tables in Databricks Delta Lake for optimal storage and querying. Work closely with multi-functional teams to understand business requirements and translate them into effective data solutions. Create clear and concise user documentation for developed code, data transformations, and performance enhancements. 
Analyze and comprehend existing SAP HANA SQL logic and procedures to identify conversion opportunities and optimization strategies for ADH Databricks SPARK SQL. Stay up-to-date with industry trends and standard methodologies in data processing, performance optimization, and cloud technologies. Education and Qualification: Bachelor's degree or equivalent experience in Computer Science, Information Technology, or a related field. Master's degree is a plus. 7+ years of experience as a Data Analyst or similar role, with a consistent track record of building data products/applications using pyspark. Expertise in using the Azure Databricks platform to build data pipelines. Good understanding of data models, star and snowflake schema structures, and data normalization/denormalization principles. Proficiency in Apache SPARK, Databricks, and SPARK SQL for data processing, transformation, and performance optimization. Experience with performance tuning, query optimization, and fixing SPARK SQL execution bottlenecks. Exposure to advanced analytics, machine learning models, or working alongside Data Science teams. Familiarity with cloud platforms such as AWS, Azure, and their data services. Familiarity with data visualization tools like Power BI, Tableau, or similar. Excellent problem-solving skills and attention to detail, with the ability to analyze complex data scenarios and provide effective solutions. Good communication skills to collaborate with technical and non-technical collaborators and convey complex technical concepts. Ability to work independently and within a team, managing multiple tasks and projects simultaneously. Travel Requirement Negligible travel should be expected with this role Relocation Assistance: This role is eligible for relocation within country Remote Type: This position is a hybrid of office/remote working Skills: Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. 
We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies