
5899 Data Warehousing Jobs - Page 29

Set up a Job Alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Analytics Engineer to design data models, build ETL pipelines, and deliver analytical solutions that support data-driven decisions.

Key Responsibilities:
- Develop and maintain data pipelines for analytics and reporting.
- Design data warehouses or data lakes to support BI tools.
- Implement data quality, validation, and governance processes.
- Collaborate with business teams to translate requirements into datasets.
- Optimize query performance for large-scale analytics.

Required Skills & Qualifications:
- Strong SQL and experience with data warehouse platforms (Snowflake, Redshift, BigQuery).
- Proficiency in Python or Scala for data processing.
- Knowledge of ETL tools (Airflow, Talend, dbt).
- Experience with BI tools (Tableau, Power BI, Looker) is a plus.
- Understanding of data modeling (star/snowflake schema, normalization).

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager, Integra Technologies
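The star-schema modeling and SQL skills this role calls for can be illustrated with a minimal, hypothetical sketch: one fact table joined to one dimension table for an aggregate report. Table and column names here are invented for illustration and are not part of the listing.

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name TEXT,
        category TEXT
    );
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        revenue REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gizmo', 'Hardware');
    INSERT INTO fact_sales VALUES (10, 1, 3, 30.0), (11, 2, 1, 15.0), (12, 1, 2, 20.0);
""")

# A typical analytical query: aggregate the fact table, slice by a dimension attribute.
rows = conn.execute("""
    SELECT d.product_name, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
    ORDER BY total_revenue DESC
""").fetchall()
print(rows)  # [('Widget', 50.0), ('Gizmo', 15.0)]
```

The same join-and-aggregate shape carries over directly to Snowflake, Redshift, or BigQuery; only the connection and dialect details change.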

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Your Role

As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
- Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
- Leverage your expertise in data warehousing concepts such as Star Schema, Snowflake Schema, and Data Vault to architect and optimize data marts and enterprise data warehouses.
- Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
- Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
- Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
- Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile
- Good knowledge of data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience with data warehousing concepts, including Star Schema, Snowflake Schema, and Data Vault, for data marts and data warehouses.
- Experience using data modeling software such as Erwin, ER/Studio, and MySQL Workbench to produce logical and physical data models.
- Expertise in conceptual modeling; ability to see the big picture and envision possible solutions.
- Experience working in a challenging, fast-paced environment.
- Excellent communication and stakeholder management skills.

What you'll love about working here

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Chennai

Work from Office

About The Role

Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, elicit processes, data, and capabilities, and derive the target processes and business requirements for the current and future solution.

Grade Specific: Performs analysis of processes, systems, data, and business information; conducts research; and builds up domain knowledge.

Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role

Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, elicit processes, data, and capabilities, and derive the target processes and business requirements for the current and future solution.

Grade Specific: Performs analysis of processes, systems, data, and business information; conducts research; and builds up domain knowledge.

Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Noida

Work from Office

About The Role

Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, elicit processes, data, and capabilities, and derive the target processes and business requirements for the current and future solution.

Grade Specific: Performs analysis of processes, systems, data, and business information; conducts research; and builds up domain knowledge.

Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About The Role

We are seeking a skilled and detail-oriented OAS/OBIEE Consultant to join our data and analytics team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence (BI) and dashboarding solutions to support smelter operations and decision-making processes. You will work closely with cross-functional teams to transform raw data into actionable insights using modern BI tools and ETL processes.

Key Responsibilities:
- Develop and maintain interactive dashboards and reports using Microsoft Power BI and Oracle Analytics.
- Design and implement ETL processes using Oracle Data Integrator and other tools to ensure efficient data integration and transformation.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Perform data analysis and validation to ensure data accuracy and consistency across systems.
- Optimize queries and data models for performance and scalability.
- Maintain and support Oracle Database and other RDBMS platforms used in analytics workflows.
- Ensure data governance, quality, and security standards are met.
- Provide technical documentation and user training as needed.

Required Skills and Qualifications:
- Proven experience in BI solutions, data analysis, and dashboard development.
- Strong hands-on experience with Microsoft Power BI, Oracle Analytics, and Oracle Data Integrator.
- Proficiency in Oracle Database, SQL, and relational database concepts.
- Solid understanding of ETL processes, data management, and data processing.
- Familiarity with business intelligence and business analytics best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience in the smelting or manufacturing industry is a plus.
- Knowledge of scripting languages (e.g., Python, Shell) for automation.
- Certification in Power BI, Oracle Analytics, or related technologies.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Tiruchirapalli

Work from Office

About The Role

Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, elicit processes, data, and capabilities, and derive the target processes and business requirements for the current and future solution.

Grade Specific: Performs analysis of processes, systems, data, and business information; conducts research; and builds up domain knowledge.

Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication

Posted 2 weeks ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Surat

Work from Office

Job Summary:

We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.
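The incremental ELT workflows mentioned above hinge on idempotent loads: re-running a batch must not create duplicates. A hedged, hypothetical sketch of that property using an in-memory SQLite upsert (a real Snowflake/DBT pipeline would use a MERGE statement or a dbt incremental model instead; table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name TEXT,
        updated_at TEXT
    )
""")

def upsert_batch(conn, batch):
    # Idempotent load: re-running the same batch leaves the table unchanged,
    # which is the property incremental ELT models rely on.
    conn.executemany(
        """
        INSERT INTO dim_customer (customer_id, name, updated_at)
        VALUES (?, ?, ?)
        ON CONFLICT(customer_id) DO UPDATE SET
            name = excluded.name,
            updated_at = excluded.updated_at
        """,
        batch,
    )

upsert_batch(conn, [(1, "Asha", "2024-01-01"), (2, "Ravi", "2024-01-01")])
upsert_batch(conn, [(2, "Ravi K", "2024-02-01")])  # updates row 2, no duplicate

result = conn.execute("SELECT * FROM dim_customer ORDER BY customer_id").fetchall()
print(result)  # [(1, 'Asha', '2024-01-01'), (2, 'Ravi K', '2024-02-01')]
```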

Posted 2 weeks ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Varanasi

Work from Office

Job Summary:

We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Surat

Work from Office

Job Summary:

We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, strong problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects.
- Architect, design, and implement scalable data pipelines and processing systems.
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization.
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions.
- Conduct code reviews and mentor junior engineers to improve code quality and skills.
- Evaluate and implement new tools and frameworks to enhance data capabilities.
- Troubleshoot complex data-related issues and support production deployments.
- Ensure compliance with data security and governance standards.
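The scalable Python data-processing theme above can be sketched with a generator-based pipeline that aggregates records in bounded memory, a common pattern before reaching for a distributed framework. This is a hypothetical illustration; the record layout and chunk size are invented:

```python
from collections import defaultdict
from itertools import islice

def read_records(n):
    # Stand-in for a large source (files, Kafka, a DB cursor): yields one
    # record at a time instead of materializing everything in memory.
    for i in range(n):
        yield {"region": "north" if i % 2 == 0 else "south", "amount": i}

def aggregate_in_chunks(records, chunk_size=1000):
    # Running per-key totals; only one chunk is held in memory at a time.
    totals = defaultdict(int)
    records = iter(records)
    while chunk := list(islice(records, chunk_size)):
        for rec in chunk:
            totals[rec["region"]] += rec["amount"]
    return dict(totals)

result = aggregate_in_chunks(read_records(10), chunk_size=4)
print(result)  # {'north': 20, 'south': 25}
```

The same map-then-reduce shape is what frameworks like Spark distribute across workers; writing it as a generator pipeline first makes the logic easy to test locally.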

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Varanasi

Work from Office

Job Summary:

We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, strong problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects.
- Architect, design, and implement scalable data pipelines and processing systems.
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization.
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions.
- Conduct code reviews and mentor junior engineers to improve code quality and skills.
- Evaluate and implement new tools and frameworks to enhance data capabilities.
- Troubleshoot complex data-related issues and support production deployments.
- Ensure compliance with data security and governance standards.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Visakhapatnam

Work from Office

Job Summary:

We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, strong problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects.
- Architect, design, and implement scalable data pipelines and processing systems.
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization.
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions.
- Conduct code reviews and mentor junior engineers to improve code quality and skills.
- Evaluate and implement new tools and frameworks to enhance data capabilities.
- Troubleshoot complex data-related issues and support production deployments.
- Ensure compliance with data security and governance standards.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Visakhapatnam

Work from Office

Job Summary:

We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Work from Office

Preferred candidate profile
- BS in Computer Science or a related field.
- 5+ years of data engineering experience.
- Good understanding of modern data platforms, including data lakes and data warehouses, with good knowledge of the underlying architecture, preferably in Snowflake.
- Proven experience assembling large, complex data sets that meet functional and non-functional business requirements.
- Experience identifying, designing, and implementing the integration, modeling, and orchestration of complex Finance data, while also finding process improvements, optimizing data delivery, and automating manual processes.
- Working experience with scripting, data science, and analytics (SQL, Python, PowerShell, JavaScript).
- Working experience with performance tuning and optimization, bottleneck analysis, and technical troubleshooting in a sometimes-ambiguous environment.
- Working experience with cloud-based systems (Azure experience preferred).

Desired Qualifications:
- Experience working with cloud-based systems: Azure and Snowflake data warehouses.
- Expertise in designing data table structures, reports, and queries.
- Working knowledge of CI/CD.
- Working knowledge of building data integrity checks as part of application delivery.
- Experience working with Kafka technologies.
- Technical expertise to build code that is performant as well as secure.
- Technical depth and vision to perform POCs and evaluate different technologies.
- Experience with Real-Time Analytics and Real-Time Messaging.
- Working experience with Microservices is desirable.
- Design, implement, and monitor best practices for the development framework.
- Experience working with large-volume data; retail experience strongly desired.
- MS degree in Computer Science or a related technical field completed.
- Possesses an entrepreneurial spirit and continuously innovates to achieve great results.
- Communicates with honesty and kindness and creates the space for others to do the same.
- Fosters connection by putting people first and building trusting relationships.
- Integrates fun and joy as a way of being and working (i.e., doesn't take themselves too seriously).

Preferred Tools: Snowflake, Microsoft Azure Data Factory, Kafka, Oracle Exadata, Power BI, SSAS, Oracle Data Integrator
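The data integrity checks mentioned above can be sketched as a small post-load validation step. This is a hypothetical outline with invented column names and thresholds, not the employer's actual framework:

```python
def run_integrity_checks(rows, required_columns, min_rows=1):
    """Return a list of failure messages; an empty list means the load passed."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    for i, row in enumerate(rows):
        for col in required_columns:
            if row.get(col) in (None, ""):
                failures.append(f"row {i}: missing value for '{col}'")
    return failures

# Example: validate a freshly loaded batch before marking the delivery complete.
loaded = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 2, "amount": None},  # should be flagged
]
problems = run_integrity_checks(loaded, required_columns=["order_id", "amount"])
print(problems)  # ["row 1: missing value for 'amount'"]
```

In a CI/CD pipeline, a non-empty failure list would typically fail the build, keeping bad loads out of downstream reports.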

Posted 2 weeks ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Pune

Work from Office

JD for the Senior Windchill Data Migration Engineer:
- 4+ years of experience in large-scale data migration projects, preferably in the healthcare devices manufacturing domain.
- 7+ years of total experience in the IT industry.
- Experience using the Windchill Bulk Migrator (WBM) tool for large-scale data migrations to PTC Windchill.
- Good experience with data migration ETLV (Extract, Transform, Load, and Validate) concepts and tools.
- Experience loading high-volume data from different source systems into PTC Windchill using WBM or a load-from-file approach.
- Develop and execute data extraction and transformation scripts based on the migration scope and migration procedure.
- Strong understanding of the Windchill PLM and Manufacturing data model.
- Good knowledge of the Windchill database and table structure.
- Able to write SQL (Oracle) queries as needed for the work.
- Strong analytical thinking and experience in data profiling and analysis.
- Strong, clear communicator who can communicate effectively with the project team.
- Track data migration defects, analyze root causes, determine solutions, and support timely resolution of defects related to extraction and transformation.
- Support data cleansing and data construction activities.
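The ETLV flow named above can be sketched generically: extract raw records, transform them to the target convention, then validate before loading. This is a hypothetical outline, not WBM-specific code; the part-number fields are invented:

```python
def extract(source_rows):
    # Extract: pull raw records from the legacy source.
    return list(source_rows)

def transform(rows):
    # Transform: normalize part numbers to the target system's convention.
    return [{"number": r["number"].strip().upper(), "name": r["name"]} for r in rows]

def validate(rows):
    # Validate: every record must carry a non-empty, unique part number.
    numbers = [r["number"] for r in rows]
    errors = []
    if len(set(numbers)) != len(numbers):
        errors.append("duplicate part numbers found")
    errors.extend(f"empty number in {r}" for r in rows if not r["number"])
    return errors

raw = [{"number": " wc-100 ", "name": "Bracket"}, {"number": "WC-200", "name": "Housing"}]
staged = transform(extract(raw))
assert validate(staged) == []          # the load only proceeds on a clean validation
print([r["number"] for r in staged])   # ['WC-100', 'WC-200']
```

Keeping validation as its own stage makes migration defects traceable to extraction, transformation, or load, which is what the defect-tracking duty above requires.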

Posted 2 weeks ago

Apply

7.0 - 12.0 years

25 - 35 Lacs

Hyderabad

Work from Office

Be a part of our success story. Launch offers talented and motivated people the opportunity to do the best work of their lives in a dynamic and growing company. Through competitive salaries, outstanding benefits, internal advancement opportunities, and recognized community involvement, you will have the chance to create a career you can be proud of. Your new trajectory starts here at Launch.

What we are looking for: We are looking for a Data Engineer to design and build ETL pipelines, with a focus on Azure Data Factory and Microsoft Fabric services.

Role: Data Engineer
Location: Hyderabad
Work Mode: WFO
Years of Experience: 5+

Mandatory Skills:
- 5+ years of hands-on experience designing and building ETL pipelines, with a focus on Azure Data Factory and Microsoft Fabric services.
- Expertise in Power BI and DAX reporting.
- Expertise in implementing CI/CD pipelines and automating deployment processes using Azure DevOps.
- Hands-on experience migrating data from on-premises databases to Azure Cloud environments, including Managed Instances.
- Proven track record of implementing version control systems and managing data pipeline architecture.
- Strong SQL skills for developing and optimizing queries, stored procedures, and database performance.
- Familiarity with Delta Lake, Synapse Analytics, or other MS Fabric-specific technologies.
- Experience working in large, agile teams to deliver data solutions in an iterative, collaborative environment.

Preferred Skills:
- Knowledge of Microsoft Fabric components such as Dataflows, Data Pipelines, and integration with Power BI for seamless analytics delivery.
- Understanding of data security practices, including data encryption and role-based access control in Azure.
- Experience with event-driven architectures using Azure Event Hubs or similar tools.
- Familiarity with DataOps principles to streamline pipeline monitoring and management.
- Excellent problem-solving skills and the ability to quickly adapt to evolving project requirements.

Certifications: Any certification related to Fabric or Azure Cloud is good to have, but not mandatory.

We are Navigators in the Age of Transformation: We use sophisticated technology to transform clients into the digital age, but our top priority is our positive impact on the human experience. We ease anxiety and fear around digital transformation and replace it with opportunity. Launch IT is an equal opportunity employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Launch IT is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation.

Posted 2 weeks ago

Apply

12.0 - 20.0 years

35 - 60 Lacs

Bengaluru

Work from Office

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor – collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success. Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers. 
In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process. As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents. Your technical expertise will shine through as you present these documents in a professional and concise manner, showcasing your mastery of the subject matter. You’ll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization. Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. 
Who You Are

You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- 10 – 15 years of experience (Specialist Seller / Consultant) is a must, with 3 – 4 years of relevant experience in Data.
- Hands-on experience with Data Platforms (DWH / data lake) such as Cloudera, Databricks, MS Data Fabric, Teradata, Apache Hadoop, BigQuery, AWS Big Data Solutions (EMR, Redshift, Kinesis), Qlik, etc.
- Proven experience modernizing legacy data and applications and transforming them to cloud architectures.
- Strong understanding of data modeling and database design.
- Expertise in data integration and ETL processes.
- Knowledge of data warehousing and business intelligence concepts.
- Experience with data governance and data quality management.
- Good domain experience in the BFSI or Manufacturing area.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data integration techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and data streaming using Python, Kafka (for streams), PySpark, DBT, and ETL services.
- Understanding of and experience with data security principles: data masking, encryption, etc.
- Knowledge of data governance principles and practices, including data quality, data lineage, data privacy, and compliance.
- Knowledge of systems development, including the system development life cycle, project management approaches, and requirements, design, and testing techniques.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
Must be organized, self-sufficient, and able to manage multiple initiatives simultaneously. Must have the ability to coordinate with other teams and vendors independently Deep knowledge of Services offerings and technical solutions in a practice Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions Prior consultative selling experience Externally recognized as an expert in the technology and/or solutioning areas, to include technical certifications supporting subdomain focus area(s) Responsible for prospecting and qualifying leads, doing the relevant product / market research independently in response to a customer’s requirement / pain point. Advising and shaping client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners. Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity. Understand and analyze the application requirements in client RFPs Design software applications based on the requirements within specified architectural guidelines and constraints. Lead, design, and implement proofs of concept and pilots to demonstrate the solution to clients / prospects. Being You Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way. 
What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 2 weeks ago

Apply

14.0 - 24.0 years

35 - 50 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Key Responsibilities: Platform Architecture Design: Lead the design and architecture of the digital platform, ensuring that the data infrastructure is scalable, secure, and reliable. Focus on utilizing AWS services (e.g., S3, Redshift, Glue, Lambda, Kinesis) and Databricks to build a robust, cloud-based data architecture. Data Integration & ETL Pipelines: Architect and implement ETL/ELT pipelines to integrate data from multiple sources (e.g., transactional databases, third-party services, APIs) into the platform, using AWS Glue, Databricks, and other tools for efficient data processing. Cloud Strategy & Deployment: Implement cloud-native solutions, leveraging AWS tools and Databricks for data storage, real-time processing, machine learning, and analytics. Design the platform to be cost-efficient, highly available, and easily scalable. Data Modelling: Develop and maintain data models for the platform that support business intelligence, reporting, and analytics. Ensure the data model design aligns with business requirements and the overall architecture of the platform. Machine Learning & Analytics Enablement: Work with data scientists and analysts to ensure that the architecture supports advanced analytics and machine learning workflows, enabling faster time to insights and model deployment. Data Security & Governance: Implement data governance frameworks to ensure data privacy, compliance, and security in the digital platform. Use AWS security tools and best practices to safeguard sensitive data and manage access control. Platform Performance & Optimization: Monitor and optimize platform performance, including the efficiency of data processing, data retrieval, and analytics workloads. Ensure low-latency and high-throughput data pipelines. 
Collaboration & Stakeholder Management: Collaborate closely with stakeholders across data engineering, data science, and business teams to align the platform architecture with business needs and evolving technological requirements. Skills & Qualifications: Required: Bachelor’s / Master’s degree in Computer Science, Engineering, or a related field. 10+ years of experience in data architecture, data engineering, or a related field, with a strong background in designing scalable, cloud-based data platforms. Extensive experience with AWS services such as S3, Redshift, Glue, Lambda, Kinesis, and RDS, with a deep understanding of cloud architecture patterns. Strong proficiency in Databricks, including experience with Apache Spark, Delta Lake, and MLflow for building data pipelines, managing large datasets, and supporting machine learning workflows. Expertise in data modelling techniques, including designing star/snowflake schemas and dimensional models, and ensuring data consistency and integrity across the platform. Experience with ETL/ELT processes, integrating data from a variety of sources, and optimizing data flows for performance. Proficiency in programming languages such as Python and SQL for data manipulation, automation, and data pipeline development. Strong knowledge of data governance and security practices, including data privacy regulations (GDPR, CCPA) and tools like AWS IAM, AWS KMS, and AWS CloudTrail. Experience with CI/CD pipelines and automation tools for deployment, testing, and monitoring of data architecture and pipelines. Preferred: Experience with real-time streaming data solutions such as Apache Kafka or AWS Kinesis within the Databricks environment. Experience with data lake management, particularly using AWS Lake Formation and Databricks Delta Lake for large-scale, efficient data storage and management. 
Soft Skills: Strong communication skills, with the ability to explain complex technical concepts to business leaders and stakeholders. Excellent problem-solving skills with the ability to architect complex, scalable data solutions. Leadership abilities with a proven track record of mentoring and guiding data teams. Collaborative mindset, capable of working effectively with cross-functional teams, including engineering, data science, and business stakeholders. Attention to detail, with a focus on building high-quality, reliable, and scalable data solutions.
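For candidates weighing the dimensional-modelling requirement above, here is a minimal sketch of the star-schema idea: one fact table of measurements joined to dimension tables for slicing. SQLite stands in for a cloud warehouse, and all table names and figures are illustrative, not from any real system.

```python
import sqlite3

# Star schema: a central fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO dim_date VALUES (10, '2024-01-01', '2024-01'), (11, '2024-01-02', '2024-01');
INSERT INTO fact_sales VALUES (100, 1, 10, 250.0), (101, 2, 11, 120.0), (102, 1, 11, 80.0);
""")

# A typical analytical query: aggregate the fact table, slicing by a dimension.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 450.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g., `dim_product` referencing a separate `dim_category` table).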

Posted 2 weeks ago

Apply

10.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

BI Solution Design & Architecture 1. Design and implement scalable and efficient Power BI solutions. 2. Define the overall data architecture strategy aligned with business goals. 3. Ensure integration with existing data platforms, warehouses, and cloud services. Dashboard & Report Development 1. Develop, manage, and maintain advanced Power BI dashboards and reports. 2. Translate complex data into clear, actionable visualizations. 3. Optimize performance and usability of reports for end users. Data Modeling & Integration 1. Create and manage data models using DAX and Power Query. 2. Integrate data from multiple sources including SQL, Excel, APIs, and cloud platforms. 3. Ensure data accuracy, consistency, and security across all reports. Collaboration & Stakeholder Engagement 1. Work closely with business stakeholders to gather requirements and deliver insights. 2. Collaborate with data engineers, analysts, and IT teams to ensure data readiness. 3. Provide training and support to Power BI users across the organization. Governance & Best Practices 1. Establish and enforce Power BI development standards and governance policies. 2. Ensure compliance with data privacy and security regulations. 3. Promote self-service BI while maintaining control over critical data assets. Continuous Improvement 1. Stay updated with the latest Power BI features and industry trends. 2. Recommend improvements to existing BI systems and processes. 3. Lead proof-of-concept initiatives for new BI tools or methodologies. Key Skills & Qualifications Strong experience in Power BI development and architecture. Proficiency in DAX, Power Query, and SQL. Solid understanding of data warehousing, ETL processes, and cloud platforms (e.g., Azure). Excellent communication and stakeholder management skills. Bachelor's or Master's degree in Computer Science, Information Systems, or related field. Notice Period: Serving or immediate joiner

Posted 2 weeks ago

Apply

2.0 - 7.0 years

5 - 8 Lacs

Jharkhand

Remote

Job Title:- Snowflake + Power BI Job Location:-Remote Job Summary: We are seeking a skilled and detail-oriented BI Developer / Data Analyst with strong expertise in Snowflake and Power BI , specifically in dataset creation, data modeling, and dashboard development . The ideal candidate will work closely with stakeholders to transform raw data into insightful, actionable business intelligence. Key Responsibilities: Design and build robust data models and datasets in Power BI to support business reporting and analytics. Develop and maintain complex SQL queries and data pipelines within Snowflake . Create reusable, scalable, and efficient data views and schemas to support self-service BI. Collaborate with business stakeholders to gather requirements, define KPIs, and design dashboards. Optimize data models for performance and scalability in both Snowflake and Power BI. Ensure data quality, governance, and security best practices are implemented. Troubleshoot data issues and resolve inconsistencies across systems. Maintain documentation of data definitions, metrics, and processes. Required Skills and Qualifications: 2-5+ years of experience in data analytics, BI development, or data engineering. Hands-on experience with Snowflake (SQL development, data warehousing concepts). Proven expertise in Power BI (Power Query, DAX, data modeling, report/dashboard development). Strong SQL skills ability to write, optimize, and troubleshoot complex queries. Experience working with large datasets and designing efficient data pipelines. Solid understanding of dimensional modeling , star/snowflake schemas , and data normalization . Familiarity with data governance, security, and compliance frameworks. Strong analytical and communication skills, with the ability to present complex data clearly.
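The "ensure data quality" and "troubleshoot data issues" responsibilities above usually reduce to a small set of mechanical checks. A minimal sketch of that idea in plain Python follows; the field names, thresholds, and sample records are illustrative only, not from any real pipeline.

```python
# Hypothetical lightweight data-quality checks: missing keys, duplicate
# keys, and out-of-range values, reported per row.
def check_dataset(rows):
    """Return a list of human-readable issues found in a list of record dicts."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("customer_id") is None:
            issues.append(f"row {i}: missing customer_id")
        elif row["customer_id"] in seen_ids:
            issues.append(f"row {i}: duplicate customer_id {row['customer_id']}")
        else:
            seen_ids.add(row["customer_id"])
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            issues.append(f"row {i}: invalid amount {row.get('amount')!r}")
    return issues

sample = [
    {"customer_id": 1, "amount": 99.5},
    {"customer_id": 1, "amount": 10.0},   # duplicate key
    {"customer_id": None, "amount": -5},  # missing key, negative amount
]
for issue in check_dataset(sample):
    print(issue)
```

In a Snowflake deployment the same checks would typically run as SQL (or as tests in a tool like dbt) against staging tables before data reaches Power BI datasets.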

Posted 2 weeks ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Lucknow

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications
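The ETL/ELT workflow this listing describes follows a fixed shape: extract from a source, land the raw records unchanged, then derive downstream models from the raw layer. Here is a toy sketch of that flow in plain Python; DBT itself manages the transform step as versioned SQL models, and every name and figure below is illustrative.

```python
# Minimal ELT sketch: extract -> load (raw, untouched) -> transform (derived model).
def extract():
    # Stand-in for reading from a source system (e.g. an SAP Data Services export).
    return [{"order_id": 1, "qty": 2, "unit_price": 10.0},
            {"order_id": 2, "qty": 1, "unit_price": 99.0}]

def load(raw, warehouse):
    # Land the raw records untouched, as an ELT pipeline would.
    warehouse["raw_orders"] = list(raw)

def transform(warehouse):
    # Derive a downstream "model" from the raw layer, DBT-style.
    warehouse["orders_enriched"] = [
        {**r, "total": r["qty"] * r["unit_price"]} for r in warehouse["raw_orders"]
    ]

warehouse = {}
load(extract(), warehouse)
transform(warehouse)
print([r["total"] for r in warehouse["orders_enriched"]])  # [20.0, 99.0]
```

Keeping the raw layer untouched is what lets transforms be re-run and audited, which is the core argument for ELT over classic ETL on warehouses like Snowflake.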

Posted 2 weeks ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Ludhiana

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Ludhiana

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks. Key Responsibilities: Provide technical leadership across Big Data and Python-based projects Architect, design, and implement scalable data pipelines and processing systems Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions Conduct code reviews and mentor junior engineers to improve code quality and skills Evaluate and implement new tools and frameworks to enhance data capabilities Troubleshoot complex data-related issues and support production deployments Ensure compliance with data security and governance standards
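The "scalable data pipelines and processing systems" above rest on one core idea: process records as a stream rather than materializing everything in memory, which is what distributed engines do at cluster scale. A toy single-machine sketch using Python generators follows; all stage names and sample data are illustrative.

```python
# Streaming-style pipeline: each stage lazily consumes the previous one,
# so memory stays flat regardless of input size.
def read_events(lines):
    for line in lines:          # stand-in for reading a file or socket
        yield line.strip()

def parse(events):
    for e in events:
        ts, value = e.split(",")
        yield {"ts": ts, "value": float(value)}

def running_total(records):
    total = 0.0
    for r in records:
        total += r["value"]
        yield total             # emit an updated aggregate per record

source = ["t1,1.5", "t2,2.0", "t3,0.5"]
print(list(running_total(parse(read_events(source)))))  # [1.5, 3.5, 4.0]
```

Frameworks like Spark or Flink apply the same composition of lazy stages, but partition the stream across workers.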

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Lucknow

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks. Key Responsibilities: Provide technical leadership across Big Data and Python-based projects Architect, design, and implement scalable data pipelines and processing systems Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions Conduct code reviews and mentor junior engineers to improve code quality and skills Evaluate and implement new tools and frameworks to enhance data capabilities Troubleshoot complex data-related issues and support production deployments Ensure compliance with data security and governance standards

Posted 2 weeks ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Profile • Design, develop, and optimize database solutions with a focus on SQL-based development and data transformation • Develop code based on reading and understanding business and functional requirements, following the Agile process • Produce high-quality code to meet all project deadlines, ensuring the functionality matches the requirements • Analyze and resolve issues found during the testing or pre-production phases of the software delivery lifecycle; coordinate changes with project team leaders and cross-work team members • Provide technical support to project team members and respond to inquiries regarding errors or questions about programs • Interact with architects, other tech leads, team members, and the project manager as required to address technical and schedule issues • Suggest and implement process improvements for estimating, development, and testing processes • BS degree in Computer Science or applicable programming area of study • A minimum of 2 years prior work experience with application or database development; must demonstrate experience delivering systems and projects from inception through implementation • Strong experience with SQL development on SQL Server and/or Oracle • Proficiency in SQL and PL/SQL, including writing queries, stored procedures, and performance tuning • Familiarity with data modelling and database design principles • Experience working in Agile/Scrum environments is preferred • Understand asynchronous and synchronous transactions and processing 
Experience with JMS, MDBs, MQ is a plus • Experience with Snowflake, Python, data warehousing technologies, data pipelines, or cloud-based data platforms is a plus • Excellent communication skills • Strong system / technical analysis skills • Self-motivation with an ability to prioritize multiple tasks • Ability to develop a strong internal network across the platform • Excellent collaboration, communication, negotiation, and conflict resolution skills • Ability to think creatively and seek optimum solutions • Ability to grasp loosely defined concepts and transform them into tangible results and key deliverables • Very strong problem-solving skills • Diagnostic skills with the ability to analyze technical, business, and financial issues and options • Ability to infer from previous examples, willingness to understand how an application is put together • Action-oriented, with the ability to quickly deal with c

Posted 2 weeks ago

Apply