
5881 Data Warehousing Jobs - Page 15

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 11.0 years

3 - 7 Lacs

New Delhi, Chennai, Bengaluru

Work from Office

Must have 2+ years of experience designing ETL flows with Pentaho Data Integration. Responsibilities:
- Develop ETL jobs based on given requirements and independently understand existing ETL jobs and reports.
- Analyze data from various sources, including databases and flat files.
- Discuss requirements with customers/end users and build customized ETL/BI reports.
- Identify problem areas in ETL loads and fix them through performance improvement.
- Write SQL scripts to validate data (a sample validation sketch follows this listing); must have 3+ years of experience developing and tuning complex SQL queries on Oracle, PostgreSQL, or another leading DBMS.
- Support daily loads and fix issues within SLA.
- Perform data migrations, set up reporting for new clients, write Linux scripts, and provide production support.
- Create source-to-target mapping documents.

Required Technical and Professional Expertise:
- Hands-on experience with ETL tools such as Pentaho.
- Hands-on experience with at least one reporting tool such as Pentaho BI, Microsoft Power BI, Oracle BI, or Tableau.
- Experience in a data warehousing role with a solid understanding of data warehousing approaches and best practices.
- Strong hands-on experience writing SQL scripts to analyze and validate data; expert knowledge of SQL commands, queries, and stored procedures.
- Strong knowledge of DB and DW concepts.
- Functional knowledge of finance, reconciliation, customer service, and pricing modules.
- Excellent SQL, PL/SQL, XML, JSON, and database skills.
- Good to have: some experience with Python or JavaScript; some knowledge of Kubernetes, Ansible, and Linux scripting.

Education (UG): Any Graduate. Key Skills: ETL, Pentaho, PL/SQL, ETL Development, ETL Tool, SQL
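For context on the "write SQL scripts to validate data" responsibility above, here is a minimal sketch of what such a check can look like in Python. Everything specific in it is an assumption for illustration: the psycopg2 driver, the connection strings, and the staging.orders/dwh.fact_orders tables are placeholders, not details from the posting.

```python
# Hypothetical source-vs-target reconciliation; the driver choice (psycopg2),
# connection strings, table names, and checks are placeholders for illustration.
import psycopg2

SOURCE_DSN = "host=src-db dbname=app user=etl"   # placeholder
TARGET_DSN = "host=dwh-db dbname=edw user=etl"   # placeholder

CHECKS = [
    # (name, source query, target query) -- each query returns one scalar
    ("row count",
     "SELECT COUNT(*) FROM staging.orders",
     "SELECT COUNT(*) FROM dwh.fact_orders"),
    ("total amount",
     "SELECT COALESCE(SUM(amount), 0) FROM staging.orders",
     "SELECT COALESCE(SUM(amount), 0) FROM dwh.fact_orders"),
]

def run_scalar(dsn: str, sql: str):
    """Execute a single-value query and return the result."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()[0]

if __name__ == "__main__":
    for name, src_sql, tgt_sql in CHECKS:
        src = run_scalar(SOURCE_DSN, src_sql)
        tgt = run_scalar(TARGET_DSN, tgt_sql)
        print(f"{name}: source={src} target={tgt} -> "
              f"{'OK' if src == tgt else 'MISMATCH'}")
```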

Posted 5 days ago

Apply

8.0 - 10.0 years

30 - 32 Lacs

Hyderabad

Work from Office

Candidate Specifications: Candidates should have 9+ years of experience, including 9+ years in Python and PySpark, with strong experience in AWS and PL/SQL. Candidates should be strong in data management, spanning data governance, data streaming, data lakes, and data warehouses, and should have exposure to team handling and stakeholder management. Excellent written and verbal communication skills are required. Contact Person: Sheena Rakesh

Posted 5 days ago

Apply

15.0 - 20.0 years

45 - 55 Lacs

Pune, Bengaluru

Work from Office

Job Title: Senior Solution Architect – Data & Cloud. Experience: 12+ years. Location: Hybrid / Remote. Practice: Migration Works. Employment Type: Full-time.

About the Company: We are a data and analytics firm that provides the strategies, tools, capability, and capacity that businesses need to turn their data into a competitive advantage. USEReady partners with cloud and data ecosystem leaders like Tableau, Salesforce, Snowflake, Starburst, and Amazon Web Services, and has been named Tableau Partner of the Year multiple times. Headquartered in NYC, the company has 450 employees across offices in the U.S., Canada, India, and Singapore, and specializes in financial services. USEReady's deep analytics expertise, unique player/coach approach, and focus on fast results make the company a perfect partner for a cloud-first, digital world.

About the Role: We are looking for a highly experienced Senior Solution Architect to join our Migration Works practice, specializing in modern data platforms and visualization tools. The ideal candidate will bring deep technical expertise in Tableau, Power BI, AWS, and Snowflake, along with strong client-facing skills and the ability to design scalable, high-impact data solutions. You will be at the forefront of our AI-driven migration and modernization initiatives, working closely with customers to understand their business needs and guiding delivery teams to success.

Key Responsibilities:
- Solution Design & Architecture: Lead the end-to-end design of cloud-native data architectures using the AWS, Snowflake, and Azure stacks. Translate complex business requirements into scalable and efficient technical solutions. Architect modernization strategies for moving legacy BI systems to cloud-native platforms.
- Client Engagement: Conduct technical discussions with enterprise clients and stakeholders to assess needs and define roadmaps. Act as a trusted advisor during pre-sales and delivery phases, showcasing technical leadership and a consultative approach.
- Migration & Modernization: Design frameworks for data platform migration (from on-premise to cloud), data warehousing, and analytics transformation. Support estimation, planning, and scoping of migration projects.
- Team Leadership & Delivery Oversight: Guide and mentor delivery teams across geographies, ensuring solution quality and alignment with client goals. Provide architectural oversight, resolve design bottlenecks, conduct technical reviews, define best practices, and uplift the team's capabilities.

Required Skills & Experience:
- 15+ years of progressive experience in data and analytics, with at least 5 years in solution architecture roles.
- Strong hands-on expertise in: Tableau and Power BI (dashboard design, visualization architecture, and migration from legacy BI tools); AWS (S3, Redshift, Glue, Lambda, and data pipeline components); Snowflake (architecture, SnowConvert, data modeling, security, and performance optimization).
- Experience migrating legacy platforms (e.g., Cognos, BusinessObjects, Qlik) to modern BI/cloud-native stacks like Tableau and Power BI.
- Proven ability to interface with senior client stakeholders, understand business problems, and propose architectural solutions.
- Strong leadership, communication, and mentoring skills.
- Familiarity with data governance, security, and compliance in cloud environments.

Preferred Qualifications:
- AWS/Snowflake certifications are a strong plus.
- Exposure to data catalogs, lineage tools, and metadata management.
- Knowledge of ETL/ELT tools such as Talend, Informatica, or dbt.
- Prior experience in consulting or fast-paced client services environments.

What We Offer: The opportunity to work on cutting-edge AI-led cloud and data migration projects; a collaborative, high-growth environment with room to shape future strategy; access to learning programs, certifications, and technical leadership exposure.

Posted 5 days ago

Apply

4.0 - 9.0 years

18 - 20 Lacs

Hyderabad

Hybrid

Position Job Title: Business Intelligence Developer. Reports To: Business Intelligence Manager.

Primary Purpose: The BI Developer applies business and advanced technical expertise to meet business data and reporting needs. The position supports business planning by compiling, visualizing, and analyzing business and statistical data from UCW's information systems. The BI Developer liaises with stakeholders across the university to provide the data, reporting, and analysis required to make informed, data-driven decisions, and will work on projects that significantly impact the student, faculty, and staff experience.

Specific Responsibilities: The BI Developer will at various times be responsible for the following, as well as other related duties assigned to support the business objectives and purpose of the company:
- Design relational databases to support business enterprise applications, with physical data modeling according to business requirements.
- Gather requirements from business departments at UCW and transform them into self-serve reports/dashboards for the various business units using Power BI.
- Understand ad-hoc data requirements and convert them into reporting deliverables.
- Contribute to reporting automation and simplification to free up time for in-depth analyses.
- Collaborate with internal and external team members, including system architects, software developers, database administrators, and design analysts, to find creative and innovative ways to enrich business data.
- Provide business and technical expertise for the university's analytics processes, tools, and applications; identify opportunities to improve data accuracy and process efficiency.
- Contribute to developing training materials, documenting processes, and delivering sessions.
- Develop strategies for data modeling, design, transport, and implementation to meet requirements for metadata management, operational data stores, and ELT/ETL environments.
- Create and test data models for a variety of business data, applications, database structures, and metadata tables to meet operational goals for performance and efficiency.
- Research modern technologies, data modeling methods, and information management systems, and recommend changes to company data architectures.
- Contribute to a team environment where all team members consistently experience a sense of belonging and inclusion.

Position Requirements Competencies:
- Demonstrated experience creating complex data models and developing insightful reports and dashboards using Microsoft Power BI; advanced skills with DAX queries.
- Connecting data sources and importing, cleaning, and transforming data for business intelligence (a small pandas illustration follows this listing).
- Knowledge of database management principles and experience working with SQL/MySQL databases.
- Ability to implement row-level security on data, with an understanding of application security layer models in Power BI.
- Ability to translate business requirements into informative reports/visuals, with a good sense of design for visually compelling reports and dashboards.
- Experience with ETL (Extract, Transform, Load) processes is an asset, as is involvement in the development of a data warehouse.
- Data analysis and visualization skills using Python and/or R are an asset.
- Strong analytical, problem-solving, and data analysis skills; ability to ensure organizational data privacy and confidentiality.
- Understanding of statistical analysis techniques such as correlation and regression.
- Demonstrated ability to collect data from a variety of sources, synthesize it, produce reports, and make recommendations.
- Ability to manage multiple concurrent tasks and competing demands.

Education and Experience: Bachelor's or Master's degree in Business, Information Systems, Computer Science, or a related discipline; demonstrated experience using Power BI to create reports, dashboards, and self-serve analytics; 3+ years of experience in data-specific roles, especially using Power BI, Excel, and SQL.
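As a small, hypothetical illustration of the "importing, cleaning, and transforming data" competency this posting lists, the pandas sketch below loads a CSV export and applies basic cleanup before BI modeling; the file name and every column name are invented for the example.

```python
# Invented example of an import-and-clean step before BI modeling; the file
# and all column names (student_id, course_id, program, enrolled_on) are
# hypothetical, not taken from the posting.
import pandas as pd

def load_enrollments(path: str) -> pd.DataFrame:
    """Load a CSV export and apply basic cleaning."""
    df = pd.read_csv(path)
    # Normalize column headers: "Enrolled On" -> "enrolled_on"
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates(subset=["student_id", "course_id"])
    df["program"] = df["program"].fillna("Unknown").str.title()
    df["enrolled_on"] = pd.to_datetime(df["enrolled_on"], errors="coerce")
    return df

if __name__ == "__main__":
    print(load_enrollments("enrollments.csv").head())  # placeholder file
```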

Posted 6 days ago

Apply

8.0 - 12.0 years

15 - 25 Lacs

Gurugram, Delhi / NCR

Work from Office

Skills Requirements: Must have: Python, Pytest, SQL, ETL Automation, AWS, Data Warehousing. Good to have: Java, Selenium, API Automation, REST Assured, Postman.
- Proficient in automation testing tools such as Selenium or Appium.
- Knowledge of scripting languages such as Python or JavaScript (a Pytest-style ETL check is sketched after this listing).
- Experience with test automation frameworks and best practices.
- Familiarity with Agile testing methodologies.
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely.
- Understands and aligns with the company's long-term vision.
- Open to new ideas and willing to learn and develop new skills.
- Able to work well under pressure and manage multiple tasks and priorities.
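Since this posting combines Python, Pytest, and ETL automation, here is a rough sketch of what a Pytest-based ETL check might look like. The fetch_scalar helper, connection strings, and table/column names are illustrative assumptions, not part of the job description.

```python
# Hypothetical Pytest-style ETL checks; fetch_scalar, the DSNs, and the
# table/column names are assumptions for illustration only. Run with `pytest`.
import psycopg2

SOURCE_DSN = "host=src dbname=app user=qa"   # placeholder
TARGET_DSN = "host=dwh dbname=edw user=qa"   # placeholder

def fetch_scalar(dsn: str, sql: str):
    """Run a single-value query and return the scalar result."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()[0]

def test_no_rows_lost_in_load():
    src = fetch_scalar(SOURCE_DSN, "SELECT COUNT(*) FROM public.customers")
    tgt = fetch_scalar(TARGET_DSN, "SELECT COUNT(*) FROM dwh.dim_customer")
    assert tgt == src, f"expected {src} rows in target, found {tgt}"

def test_no_null_business_keys():
    nulls = fetch_scalar(
        TARGET_DSN,
        "SELECT COUNT(*) FROM dwh.dim_customer WHERE customer_key IS NULL",
    )
    assert nulls == 0
```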

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

This role is part of the Performance Reporting & Analytics team based in Bengaluru. You will collaborate with Finance Business Partners across LSEG Group's Divisions and Functions to prepare and deliver insightful analysis that supports and drives decision-making. Your responsibilities will include designing and developing high-quality interactive dashboards and scorecards in Power BI for senior leadership teams. You will write complex SQL queries and custom views, develop ETL queries with M language in Power Query, and build Power BI data models based on star or snowflake schemas. Connecting and mashing up different sources, such as SQL databases, Excel files, CSVs, APIs, and online sources (on-premises or cloud), will be a crucial aspect of your role. You will streamline processes end-to-end, automate data flows, and maintain dashboards, incorporating changes in hierarchy, metrics, and organization alignments. Additionally, you will be responsible for user access management, building advanced DAX functions/calculations, and monitoring dashboard performance and usage statistics. To excel in this role, you should have 9+ years of experience in dashboarding/data visualization roles, including at least 7 years of specific exposure to Microsoft Power BI. Strong knowledge of Power BI, ETL, SQL, data modeling, and database concepts is essential. Experience with Power Automate, Power Apps, and other data visualization tools would be beneficial. You will work in a dynamic, fast-paced environment, delivering valuable insights and focusing on continuous improvement. Your ability to collaborate effectively with diverse teams and understand requirements clearly will be key to success in this role. At LSEG, we are committed to driving financial stability, empowering economies, and enabling sustainable growth. Our values of Integrity, Partnership, Excellence, and Change guide our decision-making and actions every day. Working with us means being part of a collaborative and creative culture that encourages new ideas and is dedicated to sustainability. LSEG offers a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives. Join us in our mission to re-engineer the financial ecosystem, support sustainable economic growth, and create inclusive economic opportunities.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an Analytics Manager specializing in Power BI, Python, and Tableau within the insurance domain, you will play a crucial role in designing, developing, and implementing Power BI dashboards. Your expertise in Power BI, Python, Tableau, and SQL is essential for this role. You will be responsible for leading, mentoring, and developing a team of data analysts and data scientists. Your key responsibilities will include providing strategic direction for analytical projects, defining and implementing the company's data analytics strategy, and conducting complex data analysis to identify trends and patterns. You will oversee the development of interactive dashboards, reports, and visualizations to make data insights easily accessible to stakeholders. Ensuring data integrity and consistency across systems, collaborating with cross-functional teams, and staying current with the latest data analytics trends and technologies are also important aspects of this role. Additionally, you will lead data-driven projects from initiation to execution, managing timelines, resources, and risks effectively. To be successful in this role, you should have a Bachelor's degree in Data Science, Statistics, Computer Science, Engineering, or a related field, with at least 10 years of experience in data analysis and 2 years in a managerial or leadership position. Proficiency in data analysis and visualization tools such as SQL, Python, R, Tableau, and Power BI is required. Strong knowledge of data modeling, ETL processes, and database management, along with exceptional problem-solving and critical thinking skills, is essential. Effective communication of complex technical concepts to non-technical stakeholders, proven experience in managing and growing a team of data professionals, strong project management skills, and domain knowledge in insurance will be advantageous for this role. If you are looking for a challenging opportunity to lead data analytics projects, collaborate with diverse teams, and drive business insights within the insurance domain, this role is ideal for you.

Posted 1 week ago

Apply

4.0 - 10.0 years

0 Lacs

Punjab

On-site

The Solution Designer (SAP HR) role requires a professional with 10+ years of overall development experience, including at least 4 years dedicated to solution design. The ideal candidate should possess a robust background in SAP HR design and development within the banking sector. Key Requirements: - Minimum of 10 years of professional experience, with a focus on design and architecture for at least 4 years. - Demonstrated expertise in SAP HR design. - Experience in regulatory and risk reporting preferred. - Proficiency in designing data models, free-form layouts, and reports based on regulatory report requirements. - Ability to comprehend functional documents and collaborate with business analysts/users to ensure alignment with the solution design. - Active participation representing the technology team in design and architecture discussions. - Proficiency in data warehousing and PL/SQL. - Strong background in regulatory reporting and data warehousing. - Familiarity and exposure to cloud concepts, particularly AWS. - Self-driven individual capable of delivering results within tight timelines.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Maharashtra

On-site

We are seeking a highly skilled and motivated Lead Data Scientist / Machine Learning Engineer to join a team pivotal to the development of a cutting-edge reporting platform, designed to measure and optimize online marketing campaigns effectively. Your role will focus on data engineering, the ML model lifecycle, and cloud-native technologies. You will be responsible for designing, building, and maintaining scalable ELT pipelines, ensuring high data quality, integrity, and governance (a minimal orchestration sketch follows this listing). Additionally, you will develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experimenting with different algorithms and leveraging various models will be crucial in driving insights and recommendations. Furthermore, you will deploy and monitor ML models in production and implement CI/CD pipelines for seamless updates and retraining. You will work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translating complex model insights into actionable business recommendations and presenting findings to stakeholders will also be part of your responsibilities. Qualifications & Skills: Educational Qualifications: - Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field. - Certifications in Google Cloud (Professional Data Engineer, ML Engineer) are a plus. Must-Have Skills: - Experience: 5-10 years with the mentioned skill set and relevant hands-on experience. - Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer). - ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP. - Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing. - Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms. - MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools). - Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing. Nice-to-Have Skills: - Experience with Graph ML, reinforcement learning, or causal inference modeling. - Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards. - Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies. - Experience with distributed computing frameworks (Spark, Dask, Ray). Location: Bengaluru. Brand: Merkle. Time Type: Full time. Contract Type: Permanent.
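The orchestration tools this listing names (Airflow, Dataflow, Composer) suggest pipelines shaped like the minimal Airflow sketch below. The DAG id, daily schedule, and task bodies are invented placeholders; this is one plausible shape for an ELT flow, not the employer's actual pipeline.

```python
# Minimal Airflow DAG sketch (uses the Airflow 2.4+ `schedule` argument);
# the DAG id, schedule, and task bodies are invented placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw campaign data from the source API")      # placeholder

def load():
    print("land raw files in the warehouse staging area")    # placeholder

def transform():
    print("run in-warehouse SQL transformations")            # placeholder

with DAG(
    dag_id="marketing_elt_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    transform_task = PythonOperator(task_id="transform",
                                    python_callable=transform)

    extract_task >> load_task >> transform_task
```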

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have strong SQL knowledge, including query performance tuning and an understanding of database structure, as well as knowledge of data modeling principles. Familiarity with at least one ETL tool (such as SSIS, Talend, or Ab Initio) is required. Your analytical skills should be strong, with attention to detail; you should be passionate about data structures and problem solving, and able to quickly grasp new data tools and concepts. Experience in data warehouse environments with knowledge of data architecture patterns is essential. You will be responsible for preparing and/or directing the creation of system test plans, test criteria, and test data, and for determining system design and preparing work estimates for developments or changes across multiple work efforts. Creating or updating system documents such as BRDs, functional documents, and ER diagrams will also be part of your responsibilities. Experience in handling and supporting mid-scale teams is required, along with excellent communication skills to manage the team effectively and to understand and resolve resource issues promptly to enhance team productivity. Virtusa values teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team of 27,000 individuals who are committed to your growth. You will have the opportunity to work on exciting projects, utilize state-of-the-art technologies, and advance your career. At Virtusa, collaboration and a team-oriented environment are highly valued. We provide a dynamic platform for great minds to exchange ideas, innovate, and strive for excellence.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Platform Engineer at our company, you will be a key member of the Data Platform team, leveraging your expertise in data platforms, data warehousing, and data administration. Your role will involve defining and aligning data platform strategies across the organization, ensuring optimal performance of our data lake and data warehouse environment to meet business needs effectively. Responsibilities: - Define and align Data Lake and Data Warehouse usage strategies organization-wide - Design, develop, and maintain Data Lake and Data Warehouse solutions - Perform data lake and data warehouse administration tasks including user management, security, and performance tuning - Collaborate with data architects, business stakeholders, and other teams to understand data requirements and establish guidelines and processes - Define cost attribution, optimization, and monitoring strategies for the data platform - Develop and maintain data models, schemas, and database objects - Monitor and optimize data lake and data warehouse performance for high availability and reliability - Stay updated with the latest advancements in data platforms and related technologies - Provide mentorship and guidance to junior team members Minimum Qualifications: - Bachelor's degree in Computer Science, Information Technology, or related field - 5+ years of experience in Data Platform engineering - 3+ years of hands-on experience with data lake and data warehouse - Experience with cloud platforms like AWS, Data Warehouse solutions like Snowflake - Experience with provisioning and maintenance of Spark, Presto, Kubernetes clusters - Familiarity with open table formats like Iceberg, metadata stores like HMS, GDC, etc. - Strong problem-solving skills and attention to detail - Excellent communication and collaboration skills Preferred Qualifications: - Snowflake experience - Proficiency in coding languages such as Python - Familiarity with data visualization tools like Looker - Experience with Agile/Scrum methodologies Join us at Autodesk, where every day brings new opportunities to create amazing things with our software. Embrace our culture that values diversity, belonging, and innovation, and be part of a team that shapes the future. Discover a rewarding career that helps build a better world for all.

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

Pune, Maharashtra

On-site

The Data Engineering Technology Lead position is a senior role responsible for establishing and implementing new or revised data platform ecosystems and programs in coordination with the Technology team. Your primary objective is to lead the data engineering team in implementing business requirements. Your responsibilities will include designing, building, and maintaining batch or real-time data pipelines in the data platform, as well as optimizing the data infrastructure for accurate extraction, transformation, and loading of data from various sources. You will develop ETL processes to extract and manipulate data, automate data workflows, and prepare data in data warehouses for stakeholders. Collaboration with data scientists and functional leaders to deploy machine learning models and building data products for analytics teams will be essential. Ensuring data accuracy, integrity, privacy, security, and compliance through quality control procedures, monitoring data system performance, and implementing optimization strategies are also part of your role. You will partner with management teams to integrate functions, identify necessary system enhancements, and resolve high-impact problems. In addition, you will provide expertise in applications programming, ensure application design aligns with the architecture, and develop knowledge of integrating business areas to accomplish goals. Qualifications for this position include 12+ years of experience in a data engineering role, problem-solving skills, leadership abilities, service orientation, and the ability to work in a fast-paced environment. Proficiency in technical tools, interpersonal skills, and a Bachelor's/University degree (Master's preferred) are also required. The position encompasses two key responsibilities: 1. Data Engineering: - Building Data Pipelines: Creating systems for collecting, storing, and transforming data from various sources. - Data Collection and Management: Gathering data, ensuring quality, and making it accessible for analysis. - Data Transformation: Converting raw data into usable formats using ETL processes for analysis and reporting. 2. Data Governance and Compliance: - Documenting Data Lineage: Documenting data requirements for data governance within the Citi Information Security Office. - Data Models and Flow Diagrams: Implementing data flow diagrams to understand data movement and conducting gap analysis for remediation. - Data Model Understanding: Translating business needs into logical and physical data models to ensure data integrity and consistency. - Data Analysis and Profiling: Analyzing data sources, identifying data issues, and ensuring data quality. This role falls under the Technology job family group and the Applications Support job family, and it is a full-time position at Citi. If you need accommodations due to disability, refer to the Accessibility at Citi policy. Review Citi's EEO Policy Statement and the Know Your Rights poster for further information.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As an ideal candidate for this role, you will be responsible for designing and architecting scalable Big Data solutions within the Hadoop ecosystem. Your key duties will include leading architecture-level discussions for data platforms and analytics systems, building and optimizing data pipelines using PySpark and other distributed computing tools (a brief sketch follows this listing), and translating business requirements into scalable data models and integration workflows. It will be crucial for you to guarantee the high performance and availability of enterprise-grade data processing systems. Additionally, you will play a vital role in mentoring development teams and offering guidance on best practices and performance tuning. Must-have skills for this position include architect-level experience with the Big Data ecosystem and enterprise data solutions; proficiency in Hadoop, PySpark, and distributed data processing frameworks; and hands-on experience with SQL and data warehousing concepts. A deep understanding of data lake architecture, data ingestion, ETL, and orchestration tools, along with experience in performance optimization and large-scale data handling, is essential, as are excellent problem-solving, design, and analytical skills. While not mandatory, exposure to cloud platforms such as AWS, Azure, or GCP for data solutions and knowledge of data governance, data security, and metadata management would be beneficial. Joining our team will provide you with the opportunity to work on cutting-edge Big Data technologies, gain leadership exposure, and be directly involved in architectural decisions. This full-time role on a top-tier tech team offers stability and work-life balance with a 5-day working schedule. (ref: hirist.tech)
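As a brief sketch of the PySpark pipeline work this listing describes, the snippet below reads raw files, derives a daily aggregate, and writes a partitioned output. The paths, column names, and formats are hypothetical, chosen only to illustrate the pattern.

```python
# PySpark sketch: read raw files, build a daily aggregate, write a partitioned
# table. Paths, column names, and formats are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue_sketch").getOrCreate()

orders = spark.read.parquet("s3a://raw-zone/orders/")  # placeholder path

daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"),
         F.count("*").alias("order_count"))
)

(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://curated-zone/daily_revenue/"))  # placeholder path
```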

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team. As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. Leading the development of robust data models to ensure data integrity and consistency, you will oversee the implementation of ETL processes to populate data marts with accurate and timely data. Your role will involve optimizing data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers. Responsibilities: - Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs. - Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. - Develop and implement robust data models to support data marts, ensuring data integrity and consistency. - Oversee the implementation of ETL processes to populate data marts with accurate and timely data. - Optimize data mart performance and scalability, ensuring high availability and reliability. - Monitor and troubleshoot data mart issues, providing timely resolutions and improvements. - Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity. - Mentor and guide a team of data mart developers if needed, fostering a collaborative and innovative work environment. - Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence. Required qualifications, capabilities, and skills: - Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. - Extensive experience in data warehousing, data mart development, and ETL processes. - Strong expertise in Data Lake, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server, etc.). - Leadership experience, with the ability to manage and mentor a team. - Excellent problem-solving skills and attention to detail. - Strong communication and interpersonal skills to work effectively with cross-functional teams. Preferred qualifications, capabilities, and skills: - Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud). - Familiarity with advanced data modeling techniques and tools. - Knowledge of data governance, data security, and compliance practices. - Experience with business intelligence tools (e.g., Tableau, Power BI, etc.). Candidates must be able to physically work in our Bengaluru Office in the evening shift from 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

As a Power Platform Specialist, you will be responsible for designing, developing, and implementing solutions using Power Apps, Power Automate, Dataverse, and Power BI. Your primary goal will be to enhance operational efficiency, productivity, and business intelligence within the organization. To excel in this role, you must possess a comprehensive understanding of the Power Platform, coupled with exceptional analytical and problem-solving abilities. Additionally, your aptitude for collaborative work with cross-functional teams will be crucial for success. Key Skills and Qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 4 years of hands-on experience with Power Apps, Power Automate, Dataverse, and Power BI. - Profound knowledge of the Power Platform, encompassing its functionalities, constraints, and best practices. - Strong analytical, problem-solving, and communication skills. - Proven ability to collaborate effectively with cross-functional teams. - Proficiency in Azure, Dynamics, and SharePoint. - Understanding of data modelling, data warehousing, and business intelligence. - Certification in Power Apps, Power Automate, Dataverse, or Power BI. - Familiarity with agile development methodologies such as Scrum and Kanban. - Experience working with cloud-based technologies and services. Your role will involve shaping the future of Additive Manufacturing in the MedTech industry. You will engage with global leadership to drive a transformative, clean-sheet initiative forward. As part of this dynamic space, you will be responsible for both the strategic roadmap and the tactical delivery of projects, ensuring alignment with organizational objectives.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

About PhonePe Limited: PhonePe Limited offers a diverse portfolio of businesses in India, including the distribution of financial products such as insurance, lending, and wealth, as well as new consumer tech ventures like Pincode (hyperlocal e-commerce) and the Indus AppStore, a localized app store for the Android ecosystem. The company's vision is to provide every Indian with an equal opportunity to enhance their progress by facilitating the flow of money and access to services. Culture: At PhonePe, we are committed to creating an environment where you can bring your best self to work every day. We believe in empowering our employees and entrusting them to make the right decisions. You will have ownership of your work right from the beginning, allowing you to solve complex problems and execute tasks quickly. If you are passionate about building platforms that impact millions of people, collaborating with top minds in the industry, and realizing your aspirations with purpose and speed, we welcome you to join us! PhonePe is currently looking for enthusiastic BI Engineers with 1-3 years of experience, particularly in Qlik Sense, to enhance data availability and insights on a large scale. If you are motivated by data and constantly strive for improvement, we invite you to be a part of our innovative team! In this role, you will have the opportunity to: - Work with extensive datasets and address real-world data modeling challenges to ensure scalability, flexibility, and efficiency in reporting and analytics. - Develop interactive Qlik dashboards to assist various stakeholders in making data-driven decisions. - Assist in constructing and optimizing data models that bolster robust reporting and analytics capabilities while integrating seamlessly with the organization's data architecture. - Collaborate with stakeholders to understand data requirements and ensure timely access to the relevant data. - Utilize modern open-source tools and technologies in the data processing stack, with the chance to experiment and implement automation to enhance data workflows. - Contribute to designing and establishing scalable data warehousing pipelines to process and aggregate raw data into actionable insights. - Enhance your expertise in BI and data visualization best practices in a dynamic learning environment. To be eligible for this position, you should possess: - 1-3 years of BI experience in relevant roles, preferably in a product-based company. - Proficiency in Qlik Sense development, dashboard design, and performance optimization. - Ability to create and manage Qlik Sense reports, charts, and visualizations effectively. - Understanding of data warehousing, modeling, and data flow concepts. - Strong knowledge of SQL, with Hive experience being advantageous. - Capability to translate intricate business requirements into interactive dashboards and reports. - Strong collaborative skills and execution rigor.
PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles): - Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance. - Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System. - Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program. - Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy. - Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment. - Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy. To learn more about PhonePe, visit our blog.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Power BI Lead/Senior Developer at our Hyderabad office (Hybrid), you will play a key role in designing, developing, and publishing interactive dashboards using Power BI. With 5 to 10 years of experience, you will be responsible for understanding business requirements and translating them into technical specifications for Power BI reports and dashboards. Your expertise in integrating data from various sources, building and optimizing data models, and developing ETL processes will be crucial for success. Your responsibilities will include collaborating with stakeholders, business analysts, and data engineers to ensure report alignment with business goals. Additionally, you will implement row-level security and access controls within Power BI reports, perform data validation, and monitor Power BI service performance. Staying up-to-date with Power BI updates, best practices, and industry trends will be essential to deliver high-quality solutions. To excel in this role, you should have a Bachelor's degree in Computer Science, Information Systems, or a related field, along with 5+ years of hands-on experience in Power BI report and dashboard development. Proficiency in DAX, Power Query, and data modeling, as well as strong SQL skills, is required. Experience in integrating Power BI with cloud data sources and familiarity with Power BI Gateway, Power BI Report Server, and workspace management will be beneficial. Strong analytical and problem-solving skills, excellent communication, and stakeholder management abilities are essential for effective collaboration. Preferred qualifications include Microsoft Power BI certifications, exposure to other BI tools, and experience working in Agile or Scrum teams. Join us to leverage your skills in DAX, stakeholder management, Azure, ETL, SQL, data modeling, and more to drive impactful insights and solutions through Power BI.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Senior Visualization Engineer at IDP Education Services India LLP, you will be instrumental in shaping and implementing the data visualization strategy. Your role will involve designing and developing interactive and scalable dashboards and reports using tools like Tableau/Power BI to effectively communicate insights to stakeholders. You will collaborate closely with data and business stakeholders to enable data-driven decision-making across the enterprise. Your key responsibilities will include translating business requirements into technical specifications and intuitive visualizations, creating reusable datasets to support scalable self-service analytics, and connecting to various internal and external data sources for data integration. You will work towards ensuring accurate data representation and governance alignment in the reporting layer. Moreover, you will engage with key stakeholders such as commercial, product, marketing, and client success teams to define and communicate business requirements for tracking and reporting. Your ability to present insights and prototypes effectively with clear storytelling and data narratives will be crucial in this role. Additionally, you are expected to contribute to standards and best practices in data visualization and BI development, stay updated with industry trends and technology innovations in data and analytics, and identify opportunities for automation, optimization, and simplification of analytics solutions. To be successful in this role, you should have a degree or equivalent qualification and at least 7 years of experience in data visualization and business intelligence. Proficiency in Tableau or equivalent tools, strong SQL skills, and familiarity with data warehousing and analytics concepts are essential. A certification in Power BI/Tableau would be desirable. A basic understanding of data engineering concepts and excellent proficiency in English (both spoken and written) are also required for effective engagement with cross-functional stakeholders and clear conveyance of ideas. Join us at IDP and be part of a global team dedicated to delivering success to students, test takers, and partners worldwide through trusted relationships, digital technology, and customer research. Visit www.careers.idp.com to learn more.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You have robust SQL expertise and experience developing complex PL/SQL queries, along with significant hands-on experience generating Tableau reports. Familiarity with Kyvos is considered an asset. In terms of technical expertise, you possess a deep understanding of data warehousing concepts, data lake technologies, and BI tools, and you can design and implement scalable data analysis solutions on the Kyvos platform. Your role involves engaging with potential and existing clients to understand their data challenges, present Kyvos solutions, and address technical concerns. You are responsible for crafting customized data analysis solutions that leverage Kyvos capabilities to address specific business requirements, and for staying updated on the latest Kyvos features and functionalities, including new integrations and cloud deployments. As a solution engineer, you are primarily responsible for the technical design and implementation of Kyvos solutions for clients, acting as the primary technical contact.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Data Pipeline Architect at our company, you will be responsible for designing, developing, and maintaining optimal data pipeline architecture. You will monitor incidents, perform root cause analysis, and implement appropriate actions to ensure smooth operations. Additionally, you will troubleshoot issues related to abnormal job execution and data corruption, and automate jobs, notifications, and reports for efficiency. Your role will also involve optimizing existing queries, reverse engineering for data research and analysis, and calculating the impact of issues on downstream processes for effective communication. You will support failures, address data quality issues, and ensure the overall health of the environment. Maintaining ingestion and pipeline runbooks, portfolio summaries, and DBAR will be part of your responsibilities. Furthermore, you will enable the roadmap for infrastructure changes, enhancements, and updates, and build the infrastructure for optimal extraction, transformation, and loading of data from various sources using big data technologies, Python, or web-based APIs (a short pipeline sketch follows this listing). Conducting and participating in code reviews with peers, communicating effectively, and understanding requirements will be essential in this role. To qualify for this position, you should hold a Bachelor's degree in Engineering/Computer Science or a related quantitative field. You must have a minimum of 8 years of programming experience with Python and SQL, as well as hands-on experience with GCP, BigQuery, Dataflow, Data Warehousing, Apache Beam, and Cloud Storage. Experience with massively parallel processing systems like Spark or Hadoop, source code control systems (Git), and CI/CD processes is required. Involvement in designing, prototyping, and delivering software solutions within the big data ecosystem, developing generative AI models, and ensuring code quality through reviews are key aspects of this role, as are experience with Agile development methodologies and improving data governance, quality, and reliability. Joining our team at EXL Analytics offers you the opportunity to work in a dynamic and innovative environment alongside experienced professionals. You will gain insights into various business domains, develop teamwork and time-management skills, and receive training in analytics tools and techniques. Our mentoring program and growth opportunities ensure that you have the support and guidance needed to excel in your career. Sky is the limit for our team members, and the experiences gained at EXL Analytics pave the way for personal and professional development within our company and beyond.
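Given the Apache Beam and Cloud Storage requirements this listing names, a minimal Beam pipeline might look like the following sketch; the bucket names, file pattern, and two-column CSV layout are assumptions made for illustration.

```python
# Minimal Apache Beam pipeline sketch; bucket names, file pattern, and the
# "user,amount" CSV layout are assumptions for illustration.
import apache_beam as beam

def parse_line(line: str):
    user, amount = line.split(",")          # assumes "user,amount" rows
    return user, float(amount)

with beam.Pipeline() as pipeline:           # DirectRunner by default
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://raw-bucket/events-*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "SumPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
        | "Write" >> beam.io.WriteToText("gs://curated-bucket/user_totals")
    )
```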

Posted 1 week ago

Apply

5.0 - 9.0 years

12 - 20 Lacs

Bengaluru

Hybrid

Role & Responsibilities:
- Design, develop, and maintain reports and dashboards within Power BI and Salesforce in a timely manner, based on the needs of the business.
- Build and maintain Power BI data models, including relationships, calculated columns, and measures.
- Automate data refresh schedules and ensure reports stay up to date with the latest information.
- Work with data sources such as SQL databases, Excel, SharePoint, and external APIs to gather, transform, and load data for reporting (a small illustration follows this listing).
- Work tickets assigned via Jira through our Analytics Sprint planning process, providing visible progress and closed-loop communication through updates on the tickets.
- Work with stakeholders to refine requirements and reach a clear understanding of requests.
- Manage and manipulate data and views to meet the needs of the business, working backwards through metrics and reports to understand filters in order to recreate, improve upon, and/or consolidate them.
- Help maintain our data dictionary, reporting index, and other internal documentation that gives internal teams visibility into our reporting stack.
- Provide feedback to our Data and Operations teams on the state of our data and data sources, identifying opportunities to improve data integrity or gain efficiencies in gathering data.
- Conduct training sessions and support end-users in using Power BI reports effectively.
- Stay updated on Power BI updates, best practices, and new features to improve the efficiency and effectiveness of reporting processes.
- Work with various cloud software including Salesforce, NetSuite, HubSpot, and more.

Requirements:
- Bachelor's degree in a relevant field (e.g., Computer Science, Mathematics, Statistics, Business Analytics).
- Familiarity with writing/reading SQL queries and PostgreSQL databases.
- 1+ years using AWS ETL tools: AWS Glue, AWS S3, and AWS RDS.
- 2-3 years of experience building reports and dashboards in Power BI, using DAX and Power Query.
- 2 years of experience with Python.
- 2+ years of experience with data warehousing concepts and ETL (Extract, Transform, Load) processes.
- Experience building and delivering business metrics and key performance indicators.
- Experience navigating a CRM to review and understand the business cases behind data and report requests.
- Familiarity with reading/updating Entity Relationship Diagrams (ERDs).
- Performs ongoing monitoring and improvement of reports and BI solutions.
- Ability to work with stakeholders to understand problems, gather requirements, present rationale, and explain technical logic to the team.
- Excellent at writing and contributing to internal documentation and knowledge bases.
- Excellent troubleshooting and research skills.
- Excellent listening and communication skills, both verbal and written.
- Action-oriented and customer-focused, with effective prioritization, goal setting, and time management skills.
- 1+ years of experience with Salesforce Reporting and Salesforce Objects.
- 1+ years of experience with NetSuite Reporting and datasets.
- Outstanding interpersonal skills; projection of professional image and credibility; teamwork-oriented and inclusive.
- Must be authorized to work in the United States.
- Related certifications are a plus.
- At least 1 year of experience in IT or managed services is preferred.
- Experience working at or with Managed Service Providers or MSP channel products is a big plus.
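As a small illustration of the AWS S3 plus Python/SQL skills this posting combines, the sketch below pulls a hypothetical CSV extract from S3 and computes a simple KPI with pandas. The bucket, key, and column names are invented for the example.

```python
# Hypothetical sketch: pull a CSV extract from S3 and compute a simple KPI
# with pandas. The bucket, key, and column names are invented.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="analytics-extracts",         # placeholder bucket
                    Key="tickets/2025-07.csv")            # placeholder key
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Example KPI of the kind a Power BI dashboard might surface:
kpi = df.groupby("team")["resolution_hours"].mean().round(1)
print(kpi.sort_values())
```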

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Apexon is looking for an SQL Developer (Bilingual/Japanese) to join our dynamic team and embark on a rewarding career journey:
- Designing and implementing database structures, including tables, indexes, views, and stored procedures.
- Writing and testing SQL scripts, including complex queries and transactions, to support data analysis and application development.
- Maintaining and optimizing existing database systems, troubleshooting performance issues, and resolving data integrity problems.
- Collaborating with software developers, project managers, and other stakeholders to ensure that database designs meet business requirements and technical specifications.
- Implementing database security and access control measures, ensuring the confidentiality and protection of sensitive data.
- Monitoring database performance and scalability, and making recommendations for improvements.
- Excellent communication and interpersonal skills.

Posted 1 week ago

Apply

2.0 - 4.0 years

25 - 30 Lacs

Pune

Work from Office

Rapid7 is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

1.0 - 4.0 years

25 - 30 Lacs

Thane

Work from Office

EsyCommerce is seeking a highly experienced Data Engineer to join our growing team in either Mumbai or Pune. This role requires a strong foundation in data engineering principles, coupled with experience in application development and data science techniques. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines and applications, as well as leveraging analytical skills to transform data into valuable insights. This position calls for a blend of technical expertise, problem-solving abilities, and effective communication skills to drive data-driven solutions that meet business objectives. Requires a Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Title: Data/AI Engineer – GenAI & Agentic AI Integration (Azure). Location: Bangalore, India. Job Type: Full-Time.

About the Role: We are seeking a highly skilled Data/AI Engineer to join our dynamic team, specializing in integrating cutting-edge Generative AI (GenAI) and Agentic AI solutions within the Azure cloud environment. The ideal candidate will have a strong background in Python, data engineering, and AI model integration, with hands-on experience on Databricks, Snowflake, Azure Storage, and Palantir platforms. You will play a crucial role in designing, developing, and deploying scalable data and AI pipelines that power next-generation intelligent applications.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and AI integration solutions using Python on Azure Databricks.
- Integrate Generative AI and Agentic AI models into existing and new workflows to drive business innovation and automation.
- Collaborate with data scientists, AI researchers, software engineers, and product teams to deliver scalable and efficient AI-powered solutions.
- Orchestrate data movement and transformation across Azure-native services, including Azure Databricks, Azure Storage (Blob, Data Lake), and Snowflake, ensuring data quality, security, and compliance (a small ingestion sketch follows this listing).
- Integrate enterprise data using Palantir Foundry and leverage Azure services for end-to-end solutions.
- Develop and implement APIs and services to facilitate seamless AI model deployment and integration.
- Optimize data workflows for performance and scalability within Azure; monitor, troubleshoot, and resolve issues related to data and AI pipeline performance.
- Document architecture, designs, and processes for knowledge sharing and operational excellence.
- Stay current with advances in GenAI, Agentic AI, Azure data engineering best practices, and cloud technologies.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field (or equivalent practical experience).
- 5+ years of professional experience in data engineering or AI engineering roles.
- Strong proficiency in Python for data processing, automation, and AI model integration.
- Hands-on experience with Azure Databricks and Spark for large-scale data engineering; proficiency with Snowflake for cloud data warehousing.
- In-depth experience with Azure Storage solutions (Blob, Data Lake) for data ingestion and management.
- Familiarity with Palantir Foundry or similar enterprise data integration platforms.
- Demonstrated experience integrating and deploying GenAI or Agentic AI models in production environments.
- Knowledge of API development and integration for AI and data services.
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment; excellent communication and documentation skills.

Preferred Qualifications:
- Experience with Azure Machine Learning, Azure Synapse Analytics, and other Azure AI/data services.
- Experience with MLOps, model monitoring, and automated deployment pipelines in Azure.
- Exposure to data governance, privacy, and security best practices.
- Experience with visualization tools and dashboard development.
- Knowledge of advanced AI model architectures, including LLMs and agent-based systems.

#DataEngineer Job ID R-75732 Date posted 07/24/2025
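One plausible shape for the Azure Storage ingestion step named above is sketched below using the azure-storage-blob SDK; the account URL, container, blob name, and credential are placeholders, and a production version would authenticate with a managed identity or a Key Vault secret.

```python
# Hypothetical Azure Blob ingestion step using azure-storage-blob; the account
# URL, container, blob name, and credential are placeholders. A production
# version would authenticate with a managed identity or Key Vault secret.
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://exampleaccount.blob.core.windows.net"  # placeholder

service = BlobServiceClient(account_url=ACCOUNT_URL,
                            credential="<account-key-or-sas>")  # placeholder

blob = service.get_blob_client(container="raw-zone",
                               blob="events/2025-07-24.json")   # placeholder
payload = blob.download_blob().readall()
print(f"downloaded {len(payload)} bytes for downstream transformation")
```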

Posted 1 week ago

Apply