Jobs
Interviews

951 OLAP Jobs - Page 12

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 8.0 years

1 - 5 Lacs

Chennai

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Requirements:
- Hands-on experience in data modelling for both OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience.
- Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction.
- Proficiency with at least one data modelling tool (preferably DB Schema).
- Functional knowledge of the mutual fund industry is a plus.
- Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery.
- Willingness to work from the Chennai customer site; office presence is mandatory, five days of on-site work each week.

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions.
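The indexing, partitioning, and data-sharding experience this listing asks for often comes down to routing rows by a stable key. A minimal, illustrative sketch of hash-based sharding (the function name and shard count are hypothetical, not from the posting):

```python
import hashlib

def shard_for(customer_id: str, num_shards: int = 4) -> int:
    """Map a customer ID to a shard using a stable hash.

    md5 is used here for stability across processes (not for security);
    Python's built-in hash() is salted per process and would not work.
    """
    digest = hashlib.md5(customer_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Rows with the same key always land on the same shard.
assert shard_for("CUST-1001") == shard_for("CUST-1001")
```

The same idea generalizes to range partitioning, where the router compares the key against partition boundaries instead of hashing it.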
Applications from people with disabilities are explicitly welcome.

Posted 3 weeks ago

Apply

0 years

4 - 5 Lacs

Noida

On-site

City/Cities: Noida
Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 07-Jul-2025
Job ID: 10629

Description and Requirements:
- Involvement in solution planning.
- Convert business specifications to technical specifications.
- Write clean code and review the code of project team members (as applicable).
- Adhere to the Agile delivery model.
- Able to solve L3 application-related issues.
- Should be able to scale up on new technologies.
- Should be able to document project artifacts.

Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, design patterns, and development knowledge.
- Azure cloud experience with cloud-native development as well as migration of existing applications.
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services, and Azure Service Bus.
- Understanding of Agile development and DevSecOps across the end-to-end development life cycle is required.
- Experience with cutting-edge OLAP cube technologies such as Kyligence would be a plus.
- Preferably has worked in the financial domain.

About MetLife: Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Summary: We are seeking a highly skilled Business Intelligence Developer with proven experience in SSIS, SSRS, SSAS, and the Microsoft BI stack to join our Business Intelligence and Analytics team. The ideal candidate will have a strong background in SQL Server, data integration, reporting, and analytics, with a passion for transforming business data into actionable insights.

Key Responsibilities:
- Design, develop, and maintain ETL packages using SQL Server Integration Services (SSIS) to load data from various sources into data warehouses.
- Build and deploy interactive and paginated reports using SQL Server Reporting Services (SSRS) for operational and executive reporting.
- Design, model, and implement data cubes and tabular models using SQL Server Analysis Services (SSAS), OLAP and/or Tabular.
- Utilize Microsoft Business Intelligence (MSBI) tools to deliver end-to-end data solutions.
- Work closely with business analysts and stakeholders to gather reporting and data visualization requirements.
- Write and optimize complex T-SQL queries, stored procedures, and views for data analysis and transformation.
- Perform database design, performance tuning, and maintenance on SQL Server databases.
- Participate in peer code reviews, testing, deployment, and support of solutions in production.
- Collaborate with cross-functional teams in Agile/Scrum environments.

Required Qualifications:
- Minimum 4 years of hands-on experience with SQL Server database development, design, or administration.
- At least 3 years of experience in SSIS, designing and managing complex ETL workflows.
- At least 3 years of experience with SSRS, developing dashboards, KPIs, and report subscriptions.
- Minimum 3 years of experience with SSAS, creating OLAP or Tabular data models.
- 2+ years of working experience with the Microsoft Business Intelligence (MSBI) suite.
- Strong analytical and problem-solving skills.
- Effective communication skills and the ability to collaborate with teams and clients.
Additional Requirements:
- Open to working onsite 4 days a week.
- Willing to work the 12:30 PM - 9:30 PM IST shift.
- Comfortable attending in-person final-round interviews.
- Prior experience with EisnerAmper is a plus.
- Must rate communication skills 4 or 5 on a 5-point scale.

Nice to Have:
- Knowledge of Power BI, DAX, and MDX.
- Experience in the financial or consulting domain.
- Familiarity with version control tools and CI/CD pipelines for BI deployments.
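The SSAS/OLAP work described in this listing centres on aggregating a fact table by dimension attributes. As a tool-neutral sketch of that kind of rollup (toy table and values invented for illustration; SSAS would model this declaratively), here is the equivalent GROUP BY in SQLite:

```python
import sqlite3

# In-memory warehouse-style fact table (toy data, illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?)",
    [("North", 2024, 100.0), ("North", 2024, 50.0), ("South", 2024, 75.0)],
)

# An OLAP-style aggregation of the kind SSAS cubes or SSRS reports expose.
rows = conn.execute(
    "SELECT region, year, SUM(amount) FROM fact_sales "
    "GROUP BY region, year ORDER BY region"
).fetchall()
print(rows)  # → [('North', 2024, 150.0), ('South', 2024, 75.0)]
```

A T-SQL version would be near-identical; the cube adds pre-aggregation and dimension hierarchies on top of this basic shape.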

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

New Delhi, Delhi, India

Remote

DHIRA Company Overview: DHIRA is a leading company specializing in intelligent transformation, leveraging advanced AI/ML and data-driven solutions to revolutionize business operations. Unlike traditional digital transformation, which focuses on transaction automation, our intelligent transformation encompasses both transactional automation and deep analytics for comprehensive insights. Our expertise in data engineering, data quality, and master data management ensures robust and scalable AI/ML applications. Utilizing cutting-edge technologies across AWS, Azure, GCP, and on-premises Hadoop systems, we deliver efficient and innovative data solutions. Our vision is embodied in the Akashic platform, designed to provide seamless, end-to-end analytics. At DHIRA, we are committed to excellence, driving impactful contributions to the industry. Join us to be part of a dynamic team at the forefront of intelligent transformation.

Role: Data Architect - Evolution of Databases, Data Modeling, and Modern Data Practices
Location: Bangalore, Remote

Position Overview: We are seeking a Principal Data Architect with 5+ years of experience who has a comprehensive understanding of the evolution of databases, from OLTP to OLAP, and from relational systems to NoSQL, graph, and emerging vector databases. This role requires deep expertise in data modeling, from traditional ER modeling to advanced dimensional, graph, and vector schemas, along with a strong grasp of the history, best practices, and future trends in data management. The ideal candidate will bring both historical context and cutting-edge expertise to architect scalable, high-performance data solutions, driving innovation while maintaining strong governance and best practices. This is a leadership role that demands a balance of technical excellence, strategic vision, and team mentorship.

Key Responsibilities:

1. Data Modeling Expertise:
- Design and implement Entity-Relationship (ER) models for OLTP systems, ensuring normalization and consistency.
- Transition ER models into OLAP environments with robust dimensional modeling, including star and snowflake schemas.
- Develop hybrid data models that integrate relational, NoSQL, graph, and vector database schemas.
- Establish standards for schema design across diverse database systems, focusing on scalability and query performance.

2. Database Architecture Evolution:
- Architect solutions across the database spectrum: relational (PostgreSQL, Oracle, MySQL), NoSQL (MongoDB, Cassandra, DynamoDB), graph (Neo4j, Amazon Neptune), and vector (Pinecone, Weaviate, Milvus) databases.
- Implement hybrid data architectures combining OLTP, OLAP, NoSQL, graph, and vector systems for diverse business needs.
- Ensure compatibility and performance optimization across these systems for real-time and batch processing.

3. Data Warehousing and Analytics:
- Lead the development of enterprise-scale data warehouses capable of supporting advanced analytics and business intelligence.
- Design high-performance ETL/ELT pipelines to handle structured and unstructured data with minimal latency.
- Optimize OLAP systems for petabyte-scale data storage and low-latency querying.

4. Emerging Database Technologies:
- Drive adoption of vector databases for AI/ML applications, enabling semantic search and embedding-based queries.
- Explore cutting-edge technologies in data lakes, lakehouses, and real-time processing systems.
- Evaluate and integrate modern database paradigms, ensuring scalability for future business requirements.

5. Strategic Leadership:
- Define the organization's data strategy, aligning with long-term goals and emerging trends.
- Collaborate with business and technical stakeholders to design systems that balance transactional and analytical workloads.
- Lead efforts in data governance, ensuring compliance with security and privacy regulations.

6. Mentorship and Innovation:
- Mentor junior architects and engineers, fostering a culture of learning and technical excellence.
- Promote innovation by introducing best practices, emerging tools, and modern methodologies in data architecture.
- Act as a thought leader in database evolution, presenting insights to internal teams and external forums.

Required Skills & Qualifications:

Experience:
- 6+ years of experience in data architecture, with demonstrated expertise across OLTP, OLAP, NoSQL, graph, and vector databases.
- Proven experience designing and implementing data models across relational, NoSQL, graph, and vector systems.
- A strong understanding of the evolution of databases and their impact on modern data architectures.

Technical Proficiency:
- Deep expertise in ER modeling, dimensional modeling, and schema design for modern database systems.
- Proficient in SQL and query optimization for relational and analytical databases.
- Hands-on experience with NoSQL databases such as MongoDB, Cassandra, or DynamoDB.
- Strong knowledge of graph databases (Neo4j, Amazon Neptune) and vector databases (Pinecone, Milvus, or Weaviate).
- Familiarity with modern cloud-based data warehouse platforms (e.g., Snowflake, BigQuery, Redshift) and lakehouse solutions.

Knowledge of Data Practices:
- Historical and practical understanding of data practices, from schema-on-write to schema-on-read approaches.
- Experience implementing real-time and batch processing systems for diverse workloads.
- Strong grasp of data lifecycle management, governance, and security practices.

Leadership and Communication:
- Ability to lead large-scale data initiatives, balancing technical depth and strategic alignment.
- Excellent communication skills to articulate complex ideas to technical and non-technical audiences.
- Proven ability to mentor and upskill teams, fostering a collaborative environment.

Preferred Skills:
- Experience integrating vector databases into existing architectures for AI/ML workloads.
- Knowledge of real-time streaming systems (Kafka, Pulsar) and their integration with modern databases.
- Certifications in data-related technologies (e.g., AWS, GCP, Snowflake, Neo4j).
- Hands-on experience with BI tools (e.g., Tableau, Power BI) and AI/ML platforms.
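The vector-database work this role mentions (semantic search, embedding-based queries) reduces at its core to nearest-neighbour lookup by similarity. A toy sketch with invented vectors and document IDs; real vector databases do this at scale with approximate nearest-neighbour indexes:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length, non-zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "index" of embeddings (values invented for illustration).
index = {
    "doc-a": [1.0, 0.0, 0.0],
    "doc-b": [0.9, 0.1, 0.0],
    "doc-c": [0.0, 1.0, 0.0],
}
query = [1.0, 0.0, 0.0]
best = max(index, key=lambda k: cosine(query, index[k]))
print(best)  # → doc-a
```

Systems such as Pinecone, Weaviate, or Milvus expose essentially this query shape, but trade exact scanning for ANN structures (HNSW, IVF) to stay fast at millions of vectors.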

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

Remote

Job Title: Freelance Data Pipeline Engineer - ETL, OLAP, Healthcare (HL7, FHIR)
Employment Type: Full-Time Freelancer (Independent Contractor)
Work Mode: Remote / Permanent Work From Home
Work Schedule: Monday to Friday, 8:00 PM - 5:00 AM IST (Night Shift)
Compensation: 95K-100K per month (subject to applicable TDS deductions)
Contract Duration: Initial three-month engagement, extendable based on performance and project requirements

Required Skills & Qualifications:
- 8+ years of experience in data engineering, ETL development, and pipeline automation.
- Strong understanding of data warehousing and OLAP concepts.
- Proven expertise in handling healthcare data using HL7, FHIR, and CCD/C-CDA.
- Proficiency in SQL and scripting languages such as Python or Bash.
- Experience with data integration tools (e.g., Apache NiFi, Talend, Informatica, SSIS).
- Familiarity with cloud data services (AWS, Azure, or GCP) is a plus.
- Strong knowledge of data quality assurance, data profiling, and data governance.
- Bachelor's degree in Computer Science, Information Systems, or a related field (Master's preferred).

For more details, kindly share your resume: (ref:hirist.tech)
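ETL over FHIR data, as this listing requires, typically means flattening nested resources into warehouse-friendly rows. A heavily simplified sketch (the resource fields are abridged and invented for illustration; production code should parse real FHIR resources with a proper library):

```python
# Extract: a simplified FHIR-style Patient resource (abridged).
raw = {
    "resourceType": "Patient",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1980-04-01",
}

def transform(resource: dict) -> dict:
    """Flatten the nested resource into a flat, tabular row (the T in ETL)."""
    name = resource["name"][0]
    return {
        "full_name": " ".join(name["given"]) + " " + name["family"],
        "birth_date": resource["birthDate"],
    }

row = transform(raw)
print(row)  # → {'full_name': 'Jane Doe', 'birth_date': '1980-04-01'}
```

The load step would then bulk-insert such rows into a staging table, with data-profiling checks (null rates, date formats) run before promotion to the warehouse.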

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description and Requirements:
- Involvement in solution planning.
- Convert business specifications to technical specifications.
- Write clean code and review the code of project team members (as applicable).
- Adhere to the Agile delivery model.
- Able to solve L3 application-related issues.
- Should be able to scale up on new technologies.
- Should be able to document project artifacts.

Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, design patterns, and development knowledge.
- Azure cloud experience with cloud-native development as well as migration of existing applications.
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services, and Azure Service Bus.
- Understanding of Agile development and DevSecOps across the end-to-end development life cycle is required.
- Experience with cutting-edge OLAP cube technologies such as Kyligence would be a plus.
- Preferably has worked in the financial domain.

About MetLife: Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 3 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Service Line: Digital Products and Services
Overall years of experience: 2-4 years
Relevant years of experience: 2-4 years

Job Summary: We are looking for a data visualiser in Power BI who is motivated to combine the arts of analytics and design. Responsibilities include translating design wireframes into Power BI dashboards that combine data from SharePoint lists or Excel sheets; using standard applications such as Excel and Power BI to provide the reports required in team sites, workflow trackers, etc. created in SharePoint; understanding and anticipating the customer's needs to meet or exceed expectations; and working effectively in a team environment.

Technical Skills/Tools Requirement:
Essential: Power BI, SharePoint, HTML5, CSS
Desirable: MS Excel, Access, SharePoint Modern Pages; design skills in Adobe Photoshop, Illustrator, XD

Roles and Responsibilities:
Essential:
- Design the database architecture required for dashboards.
- Possess in-depth knowledge of Power BI and its functionalities.
- Translate business needs into technical specifications.
- Design, build, and deploy BI solutions (e.g., reporting tools).
- Maintain and support data analytics platforms (e.g., MicroStrategy).
- Create tools to store data (e.g., OLAP cubes).
- Conduct unit testing and troubleshooting.
- Evaluate and improve existing BI systems.
- Collaborate with teams to integrate systems.
- Develop and execute database queries and conduct analyses.
- Create visualizations and reports for requested projects.
- Develop and update technical documentation.
Desirable:
- Excellent communication skills, both oral and written.
- Ability to work with all levels in the organization.
- Ability to communicate effectively with the team and end users.
- Good understanding of data analytics principles and ensuring that the application adheres to them.
- Ability to manage competing priorities while working collaboratively with customers and stakeholders.
- Self-motivated, with the ability to thrive in a dynamic team environment, work across organizational departments, and instill confidence with the client through work quality, time management, organizational skills, and responsiveness.
- Experience with user interface design and prototyping.

Skill Set Requirement Matrix:
Technical competency: Power BI; SharePoint Designer; Adobe Photoshop, Illustrator, XD; SP workflow creation
Operational competency: communication skills, result orientation, listening skills, customer focus, time management, planning

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
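The "create tools to store data (e.g., OLAP cubes)" responsibility above amounts, at its smallest, to a multi-dimensional rollup. A toy, tool-agnostic sketch (records invented; Power BI or SSAS would do this declaratively with a matrix visual or cube):

```python
from collections import defaultdict

# Toy records such as a SharePoint list or Excel sheet might supply.
records = [
    {"team": "Tax", "month": "Jan", "hours": 10},
    {"team": "Tax", "month": "Feb", "hours": 12},
    {"team": "Audit", "month": "Jan", "hours": 8},
]

# A two-dimensional rollup keyed by (team, month): the same shape an
# OLAP cube or a Power BI matrix visual would materialize.
cube = defaultdict(float)
for r in records:
    cube[(r["team"], r["month"])] += r["hours"]

print(cube[("Tax", "Jan")])  # → 10.0
```

Adding a dimension is just widening the key tuple; the cube engines mentioned above additionally precompute subtotals along each dimension hierarchy.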

Posted 3 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Kochi, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Service Line: Digital Products and Services
Overall years of experience: 2-4 years
Relevant years of experience: 2-4 years

Job Summary: We are looking for a data visualiser in Power BI who is motivated to combine the arts of analytics and design. Responsibilities include translating design wireframes into Power BI dashboards that combine data from SharePoint lists or Excel sheets; using standard applications such as Excel and Power BI to provide the reports required in team sites, workflow trackers, etc. created in SharePoint; understanding and anticipating the customer's needs to meet or exceed expectations; and working effectively in a team environment.

Technical Skills/Tools Requirement:
Essential: Power BI, SharePoint, HTML5, CSS
Desirable: MS Excel, Access, SharePoint Modern Pages; design skills in Adobe Photoshop, Illustrator, XD

Roles and Responsibilities:
Essential:
- Design the database architecture required for dashboards.
- Possess in-depth knowledge of Power BI and its functionalities.
- Translate business needs into technical specifications.
- Design, build, and deploy BI solutions (e.g., reporting tools).
- Maintain and support data analytics platforms (e.g., MicroStrategy).
- Create tools to store data (e.g., OLAP cubes).
- Conduct unit testing and troubleshooting.
- Evaluate and improve existing BI systems.
- Collaborate with teams to integrate systems.
- Develop and execute database queries and conduct analyses.
- Create visualizations and reports for requested projects.
- Develop and update technical documentation.
Desirable:
- Excellent communication skills, both oral and written.
- Ability to work with all levels in the organization.
- Ability to communicate effectively with the team and end users.
- Good understanding of data analytics principles and ensuring that the application adheres to them.
- Ability to manage competing priorities while working collaboratively with customers and stakeholders.
- Self-motivated, with the ability to thrive in a dynamic team environment, work across organizational departments, and instill confidence with the client through work quality, time management, organizational skills, and responsiveness.
- Experience with user interface design and prototyping.

Skill Set Requirement Matrix:
Technical competency: Power BI; SharePoint Designer; Adobe Photoshop, Illustrator, XD; SP workflow creation
Operational competency: communication skills, result orientation, listening skills, customer focus, time management, planning

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Service Line: Digital Products and Services
Overall years of experience: 2-4 years
Relevant years of experience: 2-4 years

Job Summary: We are looking for a data visualiser in Power BI who is motivated to combine the arts of analytics and design. Responsibilities include translating design wireframes into Power BI dashboards that combine data from SharePoint lists or Excel sheets; using standard applications such as Excel and Power BI to provide the reports required in team sites, workflow trackers, etc. created in SharePoint; understanding and anticipating the customer's needs to meet or exceed expectations; and working effectively in a team environment.

Technical Skills/Tools Requirement:
Essential: Power BI, SharePoint, HTML5, CSS
Desirable: MS Excel, Access, SharePoint Modern Pages; design skills in Adobe Photoshop, Illustrator, XD

Roles and Responsibilities:
Essential:
- Design the database architecture required for dashboards.
- Possess in-depth knowledge of Power BI and its functionalities.
- Translate business needs into technical specifications.
- Design, build, and deploy BI solutions (e.g., reporting tools).
- Maintain and support data analytics platforms (e.g., MicroStrategy).
- Create tools to store data (e.g., OLAP cubes).
- Conduct unit testing and troubleshooting.
- Evaluate and improve existing BI systems.
- Collaborate with teams to integrate systems.
- Develop and execute database queries and conduct analyses.
- Create visualizations and reports for requested projects.
- Develop and update technical documentation.
Desirable:
- Excellent communication skills, both oral and written.
- Ability to work with all levels in the organization.
- Ability to communicate effectively with the team and end users.
- Good understanding of data analytics principles and ensuring that the application adheres to them.
- Ability to manage competing priorities while working collaboratively with customers and stakeholders.
- Self-motivated, with the ability to thrive in a dynamic team environment, work across organizational departments, and instill confidence with the client through work quality, time management, organizational skills, and responsiveness.
- Experience with user interface design and prototyping.

Skill Set Requirement Matrix:
Technical competency: Power BI; SharePoint Designer; Adobe Photoshop, Illustrator, XD; SP workflow creation
Operational competency: communication skills, result orientation, listening skills, customer focus, time management, planning

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

2 - 4 Lacs

Navi Mumbai

Work from Office

We're Hiring: FP&A (Financial Planning & Analysis) Professional
Location: Navi Mumbai
Experience: 6-10 years

Cosette Pharmaceuticals, Inc. is building something special, and we're looking for a passionate FP&A expert to join our growing team.

What we're looking for:
- Strong background in budgeting, forecasting, and financial analysis
- Proficiency with tools like Power BI or OLAP
- Experience in the pharmaceutical or manufacturing industries (mandatory)
- A drive to simplify complexity, deliver insights, and enable smarter decisions

At our core, we value: empowering talent, precision, and data-driven growth.

Posted 3 weeks ago

Apply

0 years

8 - 9 Lacs

Bengaluru

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are inviting applications for the role of Assistant Manager, Power BI. We're looking for someone who would be responsible for the delivery of all analytics-led projects undertaken for this top-priority customer. Projects would include the following, though are not limited to it, since it's an evolving role: reporting and dashboarding (Power BI), direct data-source connections to Power BI dashboards, and value-generating analytic use cases involving data science and statistical forecasting/modeling.

Responsibilities:
- In this role, you'll be responsible for all activities related to the Power BI domain.
- Intermediate-level subject matter expert on the Microsoft technology stack: Microsoft Power BI, Azure, sophisticated analytics, and information management technology.
- Rapidly simulate and understand existing applications and the technical environment, convert business requirements into detailed functional requirements, and craft them into a functional specifications document; take responsibility and ownership of components/processes within the team.
- Should understand databases: MS SQL Server, Oracle, Azure.
- Should have exposure to SQL performance tuning, data modeling, SSIS, and SSRS.
- Working knowledge of business intelligence systems and tools (Microsoft Power BI, SQL Server Reporting Services, Tableau, QlikView, Spotfire, etc.).
- Working understanding of different data, reporting, and analytics tools and how they are used within an organization.
- Solid understanding of databases such as SQL Server, SQL Azure, and Oracle.
- A deep understanding of, and the ability to use and explain, all aspects of relational database design, multidimensional database design, OLTP, OLAP, critical metrics, scorecards, and dashboards.
- Ability to recommend architecture standard methodologies related to ETL, ELT, BI, and the life cycle of an EDW solution.
- Good to have MS Excel whiz skills: Power Query, Power View, Power Pivot.
- Strong financial analytical and problem-solving skills.
- Healthcare and life sciences industry experience is a plus.

Qualifications we seek in you!
Minimum qualifications: Knowledge of standard methodologies and IT operations in an always-up,
Preferred qualifications: Bachelor's / Graduation / Equivalent; preferably a Master's in Business Administration.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at www.genpact.com and on X, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.
Job: Assistant Manager
Primary Location: India-Bangalore
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 7, 2025, 2:02:04 AM
Unposting Date: Ongoing
Master Skills List: Operations
Job Category: Full Time

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Tech Delivery Lead

Role Summary & Role Description: The lead is responsible for providing a strong technical design and implementation platform for the Market Data Hub (MDH) and will be responsible for end-to-end delivery of the project, ensuring the strategy supports current and future business needs.

Responsibilities:
- Lead implementations to provide scalable design, state-of-the-art development and engineering practices, and automation for quality assurance and maintenance in production.
- Work closely with global and regional business and technology stakeholders.
- Participate in org-wide initiatives.
- Document the detailed data and application architecture for both the current and target state.
- Understand and implement data privacy requirements.
- Understand the application architecture and data flow/transformation.
- Capture the logical and physical data models for both the current state and the target state.
- Set the strategy for architecture to support the business and IT strategies while maintaining the data architecture principles.
- Lead architecture governance for the portfolio and provide subject-matter-expert input to design decisions across teams within the portfolio.
- Manage a holistic roadmap of architecture change initiatives, coordinating requirements across different initiatives.
- Be a key stakeholder and advisor in all new strategic data initiatives and ensure alignment to the enterprise-wide data strategy.
- Build a framework of principles to ensure data integrity across the business.
- Build and maintain appropriate data architecture artifacts, including entity-relationship models, a data dictionary, and a taxonomy to aid data traceability.
- Provide technical oversight to solution architects in creating business-driven solutions that adhere to the enterprise architecture and data governance standards.
- Develop key performance measures for data integration and quality.
- Support third-party data suppliers in developing specifications that are congruent with the enterprise data architecture.
- Act on ad-hoc duties as assigned.

Core/Must-Have Skills:
- Proficiency in application architecture, engineering practices, and big data implementations.
- Should be able to develop, maintain, and optimize data pipelines and workflows using Databricks.
- Strong engineering skills, including knowledge of languages such as Java (Hive, Apache Hadoop) and Scala (Apache Spark, Kafka).
- Understanding of data management tools to mine data, apply data masking techniques, and automate test data generation for test execution.
- Strong in the market and reference data domain.
- Strong understanding of data pipeline (extract, transform, and load) processes and supporting technologies such as Oracle PL/SQL, Unix shell scripting, Python, Hadoop, Spark and Scala, Databricks, AWS, Azure, etc.
- Excellent problem-solving and data modelling skills (logical, physical, semantic, and integration models), including normalization, OLAP/OLTP principles, and entity-relationship analysis.
- Experience creating and implementing data strategies that align with business objectives.
- Excellent communication and presentation skills, a confident and methodical approach, and the ability to work within a team environment.
- Enthusiastic and proactive, with the drive to "make things happen".
- Ability to identify gaps in processes and introduce new tools and standards to increase efficiency and productivity.
- Ability to work to deadlines in a fast-paced environment.
- Ability to take ownership and initiative.
- Self-motivated, with the ability to influence others.

Good-to-Have Skills: Domain knowledge of market data.
Work Schedule: Hybrid
Keywords (if any):
Job ID: R-769091

Posted 3 weeks ago

Apply

4.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: ETL Test Engineer. Experience range: 4-10 years. Location: Hyderabad/Bangalore/Chennai. Job description (NOTE: relevant experience in ETL testing and SQL is mandatory):
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL: expert-level knowledge of core SQL concepts and query writing.
3. ETL automation: experience in Datagap; good to have experience with tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts (OLAP vs. OLTP) and deploying applications on cloud servers.
7. Preferably a good understanding of the design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA: hands-on experience with any test management tool, preferably ADO or JIRA.
9. Agile concepts: good understanding of agile methodology (Scrum, Lean, etc.).
10. Communication: good communication skills to understand and collaborate with all the stakeholders within the project.
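The ETL validation work described in points 1-4 usually reduces to reconciling a source table against its loaded target. A minimal, tool-agnostic sketch of the two classic checks (row counts and a "minus" query) using Python's built-in sqlite3; the table names and rows are hypothetical stand-ins for a real source/target pair:

```python
import sqlite3

# Hypothetical source and target tables standing in for a real ETL pair.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5);
""")

# Check 1: row counts should match between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Check 2: a "minus" (EXCEPT) query surfaces rows present in the
# source but missing from the target after the load.
missing = conn.execute("""
    SELECT id, amount FROM src_orders
    EXCEPT
    SELECT id, amount FROM tgt_orders
""").fetchall()

print(src_count, tgt_count)  # 3 2
print(missing)               # [(3, 75.0)]
```

Tools like Datagap or Informatica DVO automate the same pattern at scale; the query shapes are unchanged.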

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary... What you'll do... About Team: Enterprise Business Services is invested in building a compact, robust organization that includes service operations and technology solutions for Finance, People, and Associate Digital Experience. Our team is responsible for the design and development of solutions that know our consumers' needs better than ever by predicting what they want based on unconstrained demand, and that efficiently unlock strategic growth, economic profit, and wallet share by orchestrating intelligent, connected planning and decisioning across all functions. We interact with multiple teams across the company to provide scalable, robust technical solutions. This role will play a crucial role in overseeing the planning, execution, and delivery of complex projects within the team. Walmart's Enterprise Business Services (EBS) is a powerhouse of several exceptional teams delivering world-class technology solutions and services that make a profound impact at every level of Walmart. As a key part of Walmart Global Tech, our teams set the bar for operational excellence and leverage emerging technology to support millions of customers, associates, and stakeholders worldwide. Each time an associate turns on their laptop, a customer makes a purchase, a new supplier is onboarded, the company closes the books, physical and legal risk is avoided, or we pay our associates consistently and accurately, that is EBS. Joining EBS means embarking on a journey of limitless growth, relentless innovation, and the chance to set new industry standards that shape the future of Walmart.
What you'll do: Manage a high-performing team of 8-10 engineers who work across multiple technology stacks, including Java and Mainframe. Drive design, development, implementation, and documentation. Establish best engineering and operational excellence practices based on product, engineering, and scrum metrics. Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community. Engage with Product and Business stakeholders to drive the agenda, set priorities, and deliver scalable and resilient products. Work closely with the architects and cross-functional teams, following established practices for the delivery of solutions meeting QCD (Quality, Cost & Delivery) within the established architectural guidelines. Work with senior leadership to chart out the future roadmap of the products. Participate in hiring, mentoring, and building high-performing agile teams. Participate in organizational events like hackathons and demo days, and be a catalyst for the success of those events. Interact closely on requirements with business owners and technical teams both within India and across the globe.

What you'll bring: Bachelor's/Master's degree in Computer Science, Engineering, or a related field, with a minimum of 12 years of experience in software development and at least 5 years of experience managing engineering teams. Prior experience managing high-performing agile technology teams. Hands-on experience building Java-based backend systems is a must; experience working on cloud-based solutions is desirable. Proficient in JavaScript, NodeJS, ReactJS, and NextJS. A good understanding of CS fundamentals, microservices, data structures, algorithms, and problem solving. Exposure to CI/CD development environments/tools including, but not limited to, Git, Maven, and Jenkins. Strong in writing modular and testable code and test cases (unit, functional, and integration) using frameworks like JUnit, Mockito, and MockMvc. Experienced in microservices architecture. Possesses a good understanding of distributed concepts, common design principles, design patterns, and cloud-native development concepts. Hands-on experience with Spring Boot, concurrency, garbage collection, RESTful services, data caching services, and ORM tools. Experience working with relational databases and writing complex OLAP, OLTP, and SQL queries. Experience working with NoSQL databases like Cosmos DB. Experience working with caching technologies like Redis, Memcached, or other related systems. Good knowledge of pub/sub systems like Kafka. Experience using monitoring and alerting tools like Prometheus, Splunk, and other related systems, and excellent debugging and troubleshooting skills. Exposure to containerization tools like Docker, Helm, and Kubernetes. Knowledge of public cloud platforms like Azure and GCP will be an added advantage.

About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.
Flexible, hybrid work: We use a hybrid way of working, with a primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging: We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer: Walmart, Inc., is an Equal Opportunity Employer, by choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing unique styles, experiences, identities, ideas, and opinions, while being inclusive of all people.

Minimum Qualifications: Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications. Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or a related area, and 5 years' experience in software engineering or a related area. Option 2: 7 years' experience in software engineering or a related area, plus 2 years' supervisory experience.

Preferred Qualifications: Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications. Master's degree in computer science, computer engineering, computer information systems, software engineering, or a related area, and 3 years' experience in software engineering or a related area. Bachelors: Computer Engineering.

Primary Location: RMZ Millenia Business Park, No 143, Campus 1B (1st-6th Floor), Dr. MGR Road (North Veeranam Salai), Perungudi, India. R-2224618

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Job Information Date Opened 07/07/2025 Job Type Full time Industry Software Product City Chennai State/Province Tamil Nadu Country India Zip/Postal Code 600017 Job Description Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando’s Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest growing technology companies by Deloitte. Role Overview As a Junior Data Warehouse Engineer at Pando, you’ll work within the Data & AI Services team to support the design, development, and maintenance of data pipelines and warehouse solutions. You'll collaborate with senior engineers and cross-functional teams to help deliver high-quality analytics and reporting solutions that power key business decisions. This is an excellent opportunity to grow your career by learning from experienced professionals and gaining hands-on experience with large-scale data systems and supply chain technologies. Key Responsibilities Assist in building and maintaining scalable data pipelines using tools like PySpark and SQL-based ETL processes. Support the development and maintenance of data models for dashboards, analytics, and reporting. Help manage parquet-based data lakes and ensure data consistency and quality. Write optimized SQL queries for OLAP database systems and support data integration efforts. Collaborate with team members to understand business data requirements and translate them into technical implementations. Document workflows, data schemas, and data definitions for internal use. 
Participate in code reviews, team meetings, and training sessions to continuously improve your skills.

Requirements: 2-4 years of experience working with data engineering or ETL tools (e.g., PySpark, SQL, Airflow). Solid understanding of SQL and basic experience with OLAP or data warehouse systems. Exposure to data lakes, preferably using the Parquet format. Understanding of basic data modeling principles (e.g., star/snowflake schema). Good problem-solving skills and a willingness to learn and adapt. Ability to work effectively in a collaborative, fast-paced team environment.

Preferred Qualifications: Experience working with cloud platforms (e.g., AWS, Azure, or GCP). Exposure to low-code data tools or modular ETL frameworks. Interest or prior experience in the supply chain or logistics domain. Familiarity with dashboarding tools like Power BI, Looker, or Tableau.
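The star-schema principle this posting asks for can be shown with a tiny fact/dimension pair: descriptive attributes live in the dimension, measures in the fact table, and reporting queries join and aggregate across them. A hedged sketch using Python's stdlib sqlite3, with an invented schema and figures purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: descriptive attributes, one row per product.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    -- Fact table: measures keyed to the dimension (the star's center).
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'electronics'), (2, 'grocery');
    INSERT INTO fact_sales VALUES (1, 2, 400.0), (1, 1, 200.0), (2, 5, 50.0);
""")

# Typical OLAP-style rollup: aggregate fact measures grouped by a
# dimension attribute, joined through the surrogate key.
rows = conn.execute("""
    SELECT d.category, SUM(f.qty), SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('electronics', 3, 600.0), ('grocery', 5, 50.0)]
```

A snowflake schema would further normalize dim_product (e.g., splitting category into its own table); the query pattern of joining facts to dimensions stays the same.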

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Hi {fullName}, there is an opportunity for an Azure Data Engineer at Noida, for which a walk-in interview will be held on 12th July 2025 between 9:00 AM and 12:30 PM. Please share the details below to mamidi.p@tcs.com with the subject line "AZURE DATA ENGINEER" if you are interested: Email id; Contact no; Total experience; Preferred location; Current CTC; Expected CTC; Notice period; Current organization; Highest full-time qualification; Highest qualification university; Any gap in education or employment (if yes, how many years and the reason for the gap); Are you available for a walk-in interview at Noida on 12th July 2025 (yes/no). We will send you an email by tomorrow night if you are shortlisted.

1. Role: Azure Data Engineer (Databricks).
2. Required technical skill set: Azure SQL, Azure SQL DW, Azure Data Lake Store, Azure Data Factory. Must have implementation and operations experience with OLTP, OLAP, and DW technologies such as Azure SQL, Azure SQL DW, Azure Data Lake Store, and Azure Data Factory, and an understanding of Microsoft Azure PaaS features. Azure Cloud, Azure Databricks, and Data Factory knowledge are good to have; otherwise, any cloud exposure. Ability to gather requirements from the client side and explain them to tech team members. Resolve conflicts in terms of bandwidth or design issues. Good understanding of data modeling, data analysis, and data governance. Very good communication and client-handling skills.
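The "OLTP, OLAP, DW technologies" requirement above rests on the difference in workload shape: OLTP systems serve indexed point lookups and small writes, while OLAP/warehouse systems scan and aggregate large row sets. A small, self-contained illustration of the two query shapes using Python's stdlib sqlite3 (the table and values are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO payments VALUES
        (1, 'north', 10.0), (2, 'south', 20.0), (3, 'north', 30.0);
""")

# OLTP shape: fetch a single row by primary key (index-driven point lookup).
one = conn.execute("SELECT amount FROM payments WHERE id = 2").fetchone()

# OLAP shape: scan and aggregate many rows (the warehouse workload that
# systems like Azure SQL DW / Synapse are built to parallelize).
agg = conn.execute(
    "SELECT region, SUM(amount) FROM payments GROUP BY region ORDER BY region"
).fetchall()

print(one)  # (20.0,)
print(agg)  # [('north', 40.0), ('south', 20.0)]
```

This difference is why warehouses favor columnar storage, partitioning, and denormalized (dimensional) models, while OLTP databases favor B-tree indexes and normalized schemas.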

Posted 4 weeks ago

Apply

5.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Job role & responsibilities: Understanding operational needs by collaborating with specialized teams. Supporting key business operations: this involves supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions. Lead a team of developers and run sprint planning and execution to ensure timely deliveries.

Technical skills, qualifications, and experience required: Proficient in data modelling, with 5-10 years of experience. Experience with data modelling tools (Erwin), including building ER diagrams; hands-on experience with the Erwin/Visio tools. Hands-on expertise in entity relationship, dimensional, and NoSQL modelling. Familiarity with manipulating datasets using Python. Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks). Exposure to UML tools like Erwin/Visio. Familiarity with tools such as Azure DevOps, Jira, and GitHub. Analytical approaches using IE or other common notations. Strong hands-on experience in SQL scripting. Bachelor's/Master's degree in Computer Science or a related field. Experience leading agile scrum, sprint planning, and review sessions. Good communication and interpersonal skills, including the ability to coordinate between business stakeholders and engineers. Strong results-orientation and time management. A true team player who is comfortable working in a global team. Ability to establish relationships with stakeholders quickly in order to collaborate on use cases. Autonomy, curiosity, and innovation capability. Comfortable working in a multidisciplinary team within a fast-paced environment.

* Immediate joiners will be preferred; outstation candidates will not be considered.

Posted 4 weeks ago

Apply

7.0 - 11.0 years

20 - 25 Lacs

Noida, Kolkata, Pune

Work from Office

Proficient in application, data, and infrastructure architecture disciplines. Advanced knowledge of architecture, design, and business processes. Hands-on experience with AWS. Proficiency in modern programming languages such as Python and Scala. Expertise in Big Data technologies like Hadoop, Spark, and PySpark. Experience with deployment tools for CI/CD, such as Jenkins. Design and develop integration solutions involving Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions. Apply system development lifecycle methodologies, such as Waterfall and Agile. Understand and implement data architecture and modeling practices, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling. Utilize knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and Big Data solutions. Work collaboratively in teams to develop meaningful relationships and achieve common goals. Strong analytical skills with deep expertise in SQL. Solid understanding of Big Data concepts, particularly with Spark and PySpark/Scala. Experience with CI/CD using Jenkins. Familiarity with NoSQL databases. Excellent communication skills.

Posted 4 weeks ago

Apply

2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Not Applicable | Specialism: SAP | Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC: Learn more about us.

Responsibilities: Create service offerings. Create customer presentations (pitch decks based on the service offerings). Help in winning new deals and managing them. Help take the offerings to market along with the sales team. Help in account mining/farming to grow the customer accounts. Ensure the services are delivered as per the contractual agreement with the customer.

Required Skills and Qualifications: Architecting and overall end-to-end design, deployment, and delivery of Azure data platforms, across data lakes, data warehouses, data lakehouses, pipelines, Databricks, BI, and data analytics solutions. Remaining up to date on new and emerging technologies. Working with clients to develop data technology strategy and roadmaps, and to plan delivery. Oversight and support of delivery team outputs. Data modelling, design, and build. Infrastructure-as-Code delivery. Enforcing technical architecture and documentation standards, policies, and procedures. Analysing, implementing, and resolving issues with existing Azure data platforms. Working with business experts and customers to understand business needs and translate business requirements into reporting and data analytics functionality. Assisting in scoping, estimation, and task planning for assigned projects. Following the project work plans to meet functionality requirements, project objectives, and timelines. Providing accurate and complete technical architecture documents. Addressing customer queries and issues in a timely manner. Providing mentoring and hands-on guidance to other team members. Experience in designing and implementing Azure data solutions using services such as Azure Synapse Analytics, Azure Databricks, Azure Data Lake Storage Gen2, Azure SQL Database, Azure Data Factory, Azure DevOps, Azure Stream Analytics, Azure Blob Storage, Azure Cosmos DB, and ARM templates. Familiar with Microsoft Power BI. Familiar with Azure Purview. An understanding of Master Data Management and Data Governance frameworks. Familiar with Infrastructure-as-Code approaches and implementations. Familiar with development approaches such as CI/CD. Familiar with Azure DevOps. Strong communication and collaboration skills. Strong analytical thinking and problem-solving skills. Ability to work as a team member and leader in a diverse technical environment. Customer-service oriented. Able to work in a fast-paced, changing environment. Proficient in spoken and written English. Willing to travel abroad when required. Graduate-level education in Computer Science or a relevant field, or a widely recognised professional qualification at a comparable level. Formal training and/or certification in related technologies is highly valued. A minimum of three years working in a similar role. Knowledge of Common Data Models/Industry Data Models/Synapse Analytics database templates will be considered an asset. Experience in OLAP technology and the Microsoft on-premises BI stack (SSIS/SSRS/SSAS) will be useful but is not compulsory. The role is highly technical and requires a robust understanding and hands-on expertise of Microsoft Azure cloud technologies, data architecture, and modelling concepts. The role also demands strong analytical, problem-solving, and planning skills. The role requires strong communication and collaboration skills and the motivation to achieve results in a dynamic business environment.

Required technical skill set: SQL, Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse.
Mandatory skill sets: Data Factory, Databricks, SQL DB, Python, ADLS, PySpark.
Preferred skill sets: Power BI, Power Apps stack, data modelling, AWS Lambda, Glue, EMR, Airflow, Kinesis, Redshift.
Years of experience required: 6-10.
Education qualification: Bachelor's degree in Computer Science, Engineering, or a related field.
Education Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Required Skills: Azure Data Factory, Databricks Platform, PySpark, Python (Programming Language), Structured Query Language (SQL), Data Modeling, Power BI
Travel Requirements:
Government Clearance Required?

Posted 4 weeks ago

Apply

2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Not Applicable | Specialism: SAP | Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities: Create service offerings. Create customer presentations (pitch decks based on the service offerings). Help in winning new deals and managing them. Help take the offerings to market along with the sales team. Help in account mining/farming to grow the customer accounts. Ensure the services are delivered as per the contractual agreement with the customer.

Required Skills and Qualifications: Architecting and overall end-to-end design, deployment, and delivery of Azure data platforms, across data lakes, data warehouses, data lakehouses, pipelines, Databricks, BI, and data analytics solutions. Remaining up to date on new and emerging technologies. Working with clients to develop data technology strategy and roadmaps, and to plan delivery. Oversight and support of delivery team outputs. Data modelling, design, and build. Infrastructure-as-Code delivery. Enforcing technical architecture and documentation standards, policies, and procedures. Analysing, implementing, and resolving issues with existing Azure data platforms. Working with business experts and customers to understand business needs and translate business requirements into reporting and data analytics functionality. Assisting in scoping, estimation, and task planning for assigned projects. Following the project work plans to meet functionality requirements, project objectives, and timelines. Providing accurate and complete technical architecture documents. Addressing customer queries and issues in a timely manner. Providing mentoring and hands-on guidance to other team members. Experience in designing and implementing Azure data solutions using services such as Azure Synapse Analytics, Azure Databricks, Azure Data Lake Storage Gen2, Azure SQL Database, Azure Data Factory, Azure DevOps, Azure Stream Analytics, Azure Blob Storage, Azure Cosmos DB, and ARM templates. Familiar with Microsoft Power BI. Familiar with Azure Purview. An understanding of Master Data Management and Data Governance frameworks. Familiar with Infrastructure-as-Code approaches and implementations. Familiar with development approaches such as CI/CD. Familiar with Azure DevOps. Strong communication and collaboration skills. Strong analytical thinking and problem-solving skills. Ability to work as a team member and leader in a diverse technical environment. Customer-service oriented. Able to work in a fast-paced, changing environment. Proficient in spoken and written English. Willing to travel abroad when required. Graduate-level education in Computer Science or a relevant field, or a widely recognised professional qualification at a comparable level. Formal training and/or certification in related technologies is highly valued. A minimum of three years working in a similar role. Knowledge of Common Data Models/Industry Data Models/Synapse Analytics database templates will be considered an asset. Experience in OLAP technology and the Microsoft on-premises BI stack (SSIS/SSRS/SSAS) will be useful but is not compulsory. The role is highly technical and requires a robust understanding and hands-on expertise of Microsoft Azure cloud technologies, data architecture, and modelling concepts. The role also demands strong analytical, problem-solving, and planning skills. The role requires strong communication and collaboration skills and the motivation to achieve results in a dynamic business environment.

Required technical skill set: SQL, Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse.
Mandatory skill sets: Data Factory, Databricks, SQL DB, Python, ADLS, PySpark.
Preferred skill sets: Power BI, Power Apps stack, data modelling, AWS Lambda, Glue, EMR, Airflow, Kinesis, Redshift.
Years of experience required: 6-10.
Education qualification: Bachelor's degree in Computer Science, Engineering, or a related field.
Education Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Required Skills: Azure Data Factory, Databricks Platform, PySpark, Python (Programming Language), Structured Query Language (SQL), Data Modeling, Power BI
Travel Requirements:
Government Clearance Required?

Posted 4 weeks ago

Apply

2.0 - 5.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Not Applicable Specialism SAP Management Level Senior Associate & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decisionmaking and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purposeled and valuesdriven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . 
Responsibilities
- Create service offerings and customer presentations (pitch decks based on the service offerings)
- Help win new deals and manage them
- Help take the offerings to market along with the Sales team
- Help with account mining/farming to grow customer accounts
- Ensure services are delivered as per the contractual agreement with the customer
- Architect and own the end-to-end design, deployment, and delivery of Azure Data Platforms, across Data Lakes, Data Warehouses, Data Lakehouses, pipelines, Databricks, BI, and Data Analytics solutions
- Remain up to date on new and emerging technologies
- Work with clients to develop data technology strategy and roadmaps, and plan delivery
- Oversee and support delivery team outputs
- Data modelling, design, and build
- Infrastructure as Code delivery
- Enforce technical architecture and documentation standards, policies, and procedures
- Analyse, implement, and resolve issues with existing Azure Data Platforms
- Work with business experts and customers to understand business needs, and translate business requirements into reporting and data analytics functionality
- Assist in scoping, estimation, and task planning for assigned projects
- Follow project work plans to meet functionality requirements, project objectives, and timelines
- Provide accurate and complete technical architecture documents
- Address customer queries and issues in a timely manner
- Provide mentoring and hands-on guidance to other team members

Experience in designing and implementing Azure Data solutions using services such as:
- Azure Synapse Analytics
- Azure Databricks
- Azure Data Lake Storage Gen2
- Azure SQL Database
- Azure Data Factory
- Azure DevOps
- Azure Stream Analytics
- Azure Blob Storage
- Azure Cosmos DB
- ARM templates

Also required:
- Familiar with Microsoft Power BI
- Familiar with Azure Purview
- An understanding of Master Data Management and Data Governance frameworks
- Familiar with Infrastructure as Code approaches and implementations
- Familiar with development approaches such as CI/CD
- Familiar with Azure DevOps
- Strong communication and collaboration skills
- Strong analytical thinking and problem-solving skills
- Ability to work as a team member and leader in a diverse technical environment
- Customer-service oriented
- Able to work in a fast-paced, changing environment
- Proficient in spoken and written English
- Willing to travel abroad when required
- Graduate-level education in Computer Science or a relevant field, or a widely recognised professional qualification at a comparable level
- Formal training and/or certification in related technologies is highly valued
- Minimum of three years working in a similar role
- Knowledge of Common Data Models/Industry Data Models/Synapse Analytics database templates will be considered an asset
- Experience in OLAP technology and the Microsoft on-premises BI stack (SSIS/SSRS/SSAS) will be useful but is not compulsory

The role is highly technical and requires a robust understanding of, and hands-on expertise in, Microsoft Azure cloud technologies and data architecture and modelling concepts. It also demands strong analytical, problem-solving, and planning skills, strong communication and collaboration skills, and the motivation to achieve results in a dynamic business environment.

Required technical skill set: SQL, Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse
Mandatory skill sets: Data Factory, Databricks, SQL DB, Python, ADLS, PySpark
Preferred skill sets: Power BI, Power Apps stack, data modelling, AWS Lambda, Glue, EMR, Airflow, Kinesis, Redshift
Years of experience required: 6-10
Education qualification: Bachelor's degree in Computer Science, Engineering, or a related field
Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology
Required Skills: Azure Data Factory, DevOps
Travel Requirements
Government Clearance Required?
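The mandatory skill set above (Data Factory, Databricks, ADLS, PySpark) centres on building extract-transform-load pipelines. As a minimal illustration of that pattern, here is a pure-Python sketch with in-memory data standing in for Azure storage and compute; all names and rules are hypothetical, not Azure APIs:

```python
# Minimal ETL sketch: the extract -> transform -> load pattern that tools
# like Azure Data Factory and Databricks orchestrate at scale.
# Data, function names, and cleaning rules here are illustrative only.

def extract(raw_rows):
    """Stand-in for a source read (e.g. a Data Lake file)."""
    return list(raw_rows)

def transform(rows):
    """Clean and reshape: drop incomplete rows, normalise fields."""
    return [
        {"customer": r["customer"].strip().lower(),
         "amount": round(float(r["amount"]), 2)}
        for r in rows
        if r.get("customer") and r.get("amount") is not None
    ]

def load(rows, sink):
    """Stand-in for an append to a warehouse table."""
    sink.extend(rows)
    return len(rows)

raw = [
    {"customer": " Alice ", "amount": "10.5"},
    {"customer": "", "amount": "3"},       # dropped: no customer
    {"customer": "Bob", "amount": None},   # dropped: no amount
    {"customer": "Bob", "amount": "7.4"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)        # 2
print(warehouse[0])  # {'customer': 'alice', 'amount': 10.5}
```

In a real Databricks job the same three stages would be PySpark reads, DataFrame transformations, and writes to ADLS or Synapse, with Data Factory handling scheduling and orchestration.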

Posted 4 weeks ago

Apply

2.0 years

4 - 9 Lacs

Bengaluru

On-site

About Lowe's
Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts, and providing disaster relief to communities in need. For more information, visit Lowes.com.

Lowe's India
Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.

About the Team
This team is responsible for building and maintaining critical enterprise platforms and frameworks that empower internal developers and drive key business functions. Their work spans the entire software development lifecycle and customer journey, encompassing tools like an Internal Developer Portal, front-end frameworks, A/B testing and customer insights platforms, workflow and API management solutions, a Customer Data Platform (CDP), and robust testing capabilities including performance and chaos testing. This team is instrumental in providing the foundational technology that enables innovation, efficiency, and a deep understanding of their customers.
Job Summary:
As a Software Engineer with a focus on data engineering, you will play a critical role in building and optimizing our data infrastructure. Your responsibilities will include designing and implementing scalable data pipelines, working with various data processing frameworks, and ensuring data quality and availability for analytics and decision-making processes.

Roles & Responsibilities:
- Data Engineering: Design, build, and maintain robust data pipelines to ingest, process, and analyze large datasets.
- System Design & Architecture: Contribute to the design and implementation of distributed systems that ensure high availability, low latency, and fault tolerance at scale.
- Testing & Quality Assurance: Implement best practices for data governance, data quality, and security in data engineering workflows.
- Debugging & Troubleshooting: Investigate and resolve bugs and performance issues in both development and production environments.
- Code Review & Mentorship: Participate in code reviews, share knowledge with peers, and learn from senior engineers to continuously improve code quality and team productivity.
- Technical Documentation: Document system design, APIs, and service behavior to facilitate maintainability and cross-team collaboration.
- Cross-functional Collaboration: Work with engineering and analytics teams to understand their needs, ensure compliance, and provide tools that empower them to make data-driven decisions.
- Innovation & Learning: Stay updated with emerging trends in data engineering and continuously suggest improvements to our existing systems.
- Operational Excellence: Monitor data pipeline performance and troubleshoot issues to ensure high availability and reliability of data services.
Years of Experience:
- 2 years of experience in software development or a related field
- 2 years of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC) through iterative agile development
- 2 years of experience in data engineering or software development with a strong emphasis on data processing

Required Minimum Qualifications:
- 2 years of experience writing technical documentation in a software environment and developing and implementing business systems within an organization
- Bachelor's degree in computer science, computer information systems, or a related field (or equivalent work experience in lieu of a degree)

Skill Set Required:
- Strong understanding of Data Structures and Algorithms, Object-Oriented Programming (OOP), and Aspect-Oriented Programming (AOP).
- Solid knowledge of Software Design Patterns, Distributed Systems, and Microservices Architecture.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience working with OLAP databases such as Druid and ClickHouse.
- Familiarity with tools such as Kafka, Git/Bitbucket, the ELK Stack, Prometheus, and Grafana.

Secondary Skills (desired):
- Exposure to containerization and orchestration tools (e.g., Docker, Kubernetes).
- Understanding of performance tuning and system reliability engineering concepts.
- Knowledge of security best practices and compliance in data handling.
- Familiarity with large-scale data processing frameworks (e.g., Apache Flink or Spark).
- Experience with cloud platforms, such as Google Cloud Platform (GCP) or others.

Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law.
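The data-quality responsibilities in this role typically start with a validation gate early in the pipeline: records are checked against an expected schema, and failures are quarantined rather than passed downstream. A minimal sketch, assuming a hypothetical record schema (nothing here is Lowe's internal tooling):

```python
# Illustrative data-quality gate for a pipeline: validate each record
# against a simple expected schema before it reaches downstream analytics.
# Field names and rules are hypothetical.

EXPECTED = {"order_id": int, "sku": str, "qty": int}

def validate(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field, ftype in EXPECTED.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    return problems

def split_good_bad(records):
    """Route clean rows onward; quarantine the rest for inspection."""
    good, bad = [], []
    for r in records:
        (bad if validate(r) else good).append(r)
    return good, bad

batch = [
    {"order_id": 1, "sku": "A-100", "qty": 2},
    {"order_id": "2", "sku": "B-200", "qty": 1},  # order_id has the wrong type
    {"order_id": 3, "qty": 4},                    # sku is missing
]
good, bad = split_good_bad(batch)
print(len(good), len(bad))  # 1 2
```

Production systems usually express the same idea through frameworks (schema registries, dbt tests, Great Expectations), but the routing pattern is the same.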

Posted 4 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Noida

Work from Office

Sr. MongoDB Administrator

Job Description:
- Optimize performance through indexing, query tuning, and resource allocation.
- Conduct benchmarking and stress testing to evaluate database performance and stability under load.
- Perform real-time monitoring and health checks for all databases.
- Create, review, and optimize complex NoSQL queries.
- Implement and oversee replication, sharding, and backup drills.
- Develop and maintain disaster recovery plans with regular testing.
- Lead database migration projects with minimal downtime.
- Design and manage database architecture for OLTP and OLAP systems.
- Manage integration with Big Data systems, Data Lakes, Data Marts, and Data Warehouses.
- Administer databases on Linux environments and AWS cloud (RDS, EC2, S3, etc.).
- Use Python scripting for automation and custom DBA tools.
- Collaborate with DevOps, engineering, and analytics teams.

Experience Range: 3-6 years
Educational Qualifications: Any graduation
Skills Required: MongoDB, AWS, Big Data
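The indexing work listed first in this role rests on one idea: an index trades extra memory and write cost for fast lookups instead of full collection scans. A conceptual pure-Python sketch of that trade-off (MongoDB's real indexes are B-trees created with createIndex, not hash maps; the data here is made up):

```python
# Why indexing speeds queries: a field index maps values to documents,
# replacing a full collection scan with a direct lookup.
from collections import defaultdict

docs = [
    {"_id": 1, "city": "Noida", "status": "active"},
    {"_id": 2, "city": "Delhi", "status": "inactive"},
    {"_id": 3, "city": "Noida", "status": "active"},
]

def scan(collection, field, value):
    """Unindexed query: touches every document, O(n) per query."""
    return [d for d in collection if d.get(field) == value]

def build_index(collection, field):
    """One-time pass, conceptually like db.coll.createIndex({field: 1})."""
    idx = defaultdict(list)
    for d in collection:
        idx[d.get(field)].append(d)
    return idx

def indexed_find(idx, value):
    """Indexed query: a single lookup instead of a scan."""
    return idx.get(value, [])

city_idx = build_index(docs, "city")

# Both strategies return the same documents; only the cost differs.
assert scan(docs, "city", "Noida") == indexed_find(city_idx, "Noida")
print([d["_id"] for d in indexed_find(city_idx, "Noida")])  # [1, 3]
```

In practice a DBA verifies this with explain(), confirming the plan shows an index scan (IXSCAN) rather than a collection scan (COLLSCAN).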

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Chennai

Work from Office

Project description
You will be working in a cutting-edge banking environment that is currently undergoing a thorough upgrade program. You will be responsible for translating business data into reusable, adjustable dashboards used by senior business managers.

Responsibilities
- Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting.
- Build and manage ETL workflows using SSIS to support data integration across multiple sources.
- Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring.
- Develop and maintain OLAP cubes using SSAS for multidimensional data analysis.
- Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions.
- Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems.
- Perform performance tuning and query optimization for large datasets and improve system responsiveness.
- Ensure data quality, consistency, and integrity through robust validation and testing processes.
- Maintain documentation for data pipelines, ETL jobs, and reporting structures.
- Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions.

Skills

Must have
- At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must).
- Proficient with Tableau, preferably with at least 4 years of experience creating dashboards.
- Experience working with businesses and delivering dashboards to senior management.
- Experience working within a Data Warehouse architecture is a must.
- Exposure to Microsoft BI.

Nice to have
- N/A
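The OLAP-cube work above boils down to aggregating a fact table over dimensions such as region and product, the kind of summarisation an SSAS cube precomputes. A small runnable sketch using Python's stdlib sqlite3 in place of SQL Server (the table, columns, and figures are made up for illustration):

```python
# OLAP-style rollup sketch: aggregate a fact table over a dimension.
# sqlite3 (stdlib) stands in for SQL Server; schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("North", "Widget", 100.0),
        ("North", "Gadget", 50.0),
        ("South", "Widget", 75.0),
        ("South", "Widget", 25.0),
    ],
)

# Region-level rollup: one slice of what a cube would precompute
# across every combination of its dimensions.
by_region = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(by_region)  # [('North', 150.0), ('South', 100.0)]
conn.close()
```

An SSAS cube extends this idea to many dimensions and hierarchies at once (e.g. T-SQL's GROUP BY ROLLUP/CUBE), materialising the aggregates so dashboards read them instead of rescanning the fact table.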

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, and information management and associated technologies. Communicate risks and ensure these risks are understood.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- Graduate with a minimum of 6+ years of related experience.
- Experience in modelling and business system design.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 1 month ago

Apply