13.0 - 17.0 years
37 - 45 Lacs
Noida
Work from Office
Responsibilities:
- Involvement in solution planning; convert business specifications to technical specifications.
- Write clean code and review the code of project team members (as applicable).
- Adhere to the Agile delivery model.
- Able to solve L3 application-related issues.
- Able to scale up on new technologies and to document project artifacts.
Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, Design Patterns, and development knowledge.
- Azure Cloud experience with cloud-native development as well as migration of existing applications.
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services, and Azure Service Bus.
- Understanding of Agile development and DevSecOps for the end-to-end development life cycle.
- Experience in cutting-edge OLAP cube technologies such as Kyligence would be a plus.
- Preferably worked in the financial domain.
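As a hedged illustration of the Databricks work named above: a minimal PySpark read-transform-write job of the kind such a role involves. It assumes a Databricks runtime where Delta is available; the storage paths and column names are hypothetical placeholders.

```python
# Illustrative PySpark job of the kind run on Azure Databricks: read raw
# files from a lake, apply a transformation, and write a curated Delta table.
# All paths and column names below are invented for the sketch.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = (spark.read.format("csv")
       .option("header", "true")
       .load("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

curated = (raw
           .filter(F.col("order_status") == "COMPLETE")        # drop open orders
           .withColumn("order_amount", F.col("order_amount").cast("double"))
           .dropDuplicates(["order_id"]))                      # idempotent reload

(curated.write.format("delta")
 .mode("overwrite")
 .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```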
Posted 1 week ago
5.0 - 10.0 years
9 - 13 Lacs
Gurugram
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals, including end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise:
- Minimum of 5 years of related experience
- Experience in modeling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Strong expertise in writing T-SQL code (see the sketch after this listing)
- Well versed in data warehouse schemas and OLAP techniques
Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate
- A strong team player/leader, able to lead data transformation projects with multiple junior data engineers
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization
- Ability to clearly communicate complex business problems and technical solutions
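To make the T-SQL/OLAP requirement concrete, here is a small, self-contained sketch of a windowed running total, the kind of OLAP-style query the role calls for. SQLite is used so the snippet runs anywhere; the SELECT itself is equally valid T-SQL, and the table is invented.

```python
# Run an OLAP-style running total per region using a window function.
# SQLite stands in for SQL Server here; the query text is also valid T-SQL.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE fact_sales (region TEXT, sale_date TEXT, amount REAL);
    INSERT INTO fact_sales VALUES
        ('North', '2024-01-01', 100), ('North', '2024-01-02', 150),
        ('South', '2024-01-01', 200), ('South', '2024-01-02', 50);
""")

query = """
    SELECT region, sale_date, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY sale_date)
               AS running_total
    FROM fact_sales
    ORDER BY region, sale_date;
"""
for row in con.execute(query):
    print(row)   # e.g. ('North', '2024-01-02', 150.0, 250.0)
```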
Posted 1 week ago
7.0 - 10.0 years
6 - 11 Lacs
Kolkata
Work from Office
The Senior PL/SQL SAS Mart Developer (BFSI) will be a crucial member of the data warehousing team, responsible for the design, development, and maintenance of data marts tailored for the Banking, Financial Services, and Insurance (BFSI) domain. The role requires deep PL/SQL development skills for data transformation and manipulation in an Oracle environment, coupled with SAS expertise for data mart creation, reporting, and analytics. The successful candidate will leverage extensive BFSI experience to build efficient, robust, and compliant data marts that support critical business intelligence, reporting, and analytical needs.
Responsibilities:
- Data Mart Design and Development: Design and develop dimensional and multi-dimensional data marts based on business requirements within the BFSI context, using PL/SQL for backend data processing and SAS for data mart structuring and access.
- PL/SQL Development for Data Marts: Use advanced PL/SQL to extract, transform, and load data from the central data warehouse or source systems into the designated data marts, ensuring data quality, performance, and adherence to BFSI data standards. This includes developing complex stored procedures, functions, packages, and triggers.
- SAS Data Mart Implementation: Leverage SAS tools (e.g., SAS Data Integration Studio, SAS Enterprise Guide, SAS OLAP Cube Studio) to structure, populate, and manage data marts for efficient reporting and analysis.
- BFSI Data Expertise: Apply a strong understanding of BFSI data models, key performance indicators (KPIs), regulatory reporting requirements (e.g., Basel, RBI, Solvency II), and common analytical needs within financial services and insurance to design relevant and effective data marts.
- Performance Optimization: Optimize PL/SQL code and SAS processes related to data mart development and access to ensure high performance and efficient query execution for reporting and analytical tools.
- Data Quality and Governance: Implement and enforce data quality checks within both PL/SQL and SAS processes to ensure the accuracy, consistency, and integrity of data in the marts, adhering to the bank's data governance policies and BFSI-specific data quality standards.
- Troubleshooting and Support: Investigate and resolve issues related to data mart performance, data accuracy, and accessibility across both PL/SQL and SAS components; provide expert-level support for data mart users.
- Documentation: Create and maintain comprehensive technical documentation for data mart designs, PL/SQL code, SAS jobs, data mappings, and user guides, in compliance with BFSI documentation standards.
- Collaboration: Work closely with business analysts, report developers, data scientists, and other BFSI stakeholders to understand their data mart requirements and deliver solutions that meet their analytical and reporting needs.
- Security and Compliance: Ensure that data marts, and the processes used to build and access them, comply with the bank's security policies and relevant BFSI regulatory requirements regarding data access and privacy.
- Mentoring: Provide technical guidance and mentorship to junior developers on PL/SQL and SAS skills related to data mart development within the BFSI domain.
Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Technology, Statistics, Economics, Finance, or a related field.
- Proven experience of 7-10 years in developing data warehousing and business intelligence solutions, with a significant focus on data mart development within the BFSI sector.
- Extensive, demonstrable expertise in PL/SQL development, including advanced querying, stored procedures, functions, packages, and performance tuning within an Oracle environment.
- Strong proficiency in SAS programming and experience using SAS tools for data mart creation and management (e.g., SAS Data Integration Studio, SAS Enterprise Guide, SAS OLAP Cube Studio, SAS Metadata Server).
- Deep understanding of BFSI data models, common KPIs, and regulatory reporting requirements (e.g., Basel, RBI guidelines, Solvency II, IFRS).
- Strong SQL skills and experience working with relational databases, particularly Oracle.
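A hedged sketch of the mart-load pattern this role describes: transform source rows, enforce data-quality gates, then publish. It is written in Python/pandas purely to show the pattern (the role itself would use PL/SQL packages and SAS jobs); all column names, rules, and thresholds are hypothetical.

```python
# Extract-transform-load into a mart with data-quality gates, sketched in
# pandas. Everything here (columns, checks) is an invented illustration.
import pandas as pd

def load_loan_mart(source: pd.DataFrame) -> pd.DataFrame:
    # Transform: derive a monthly reporting-period key and total exposure.
    mart = source.assign(
        period=pd.to_datetime(source["booking_date"]).dt.to_period("M").astype(str),
        exposure=source["principal"].fillna(0) + source["accrued_interest"].fillna(0),
    )

    # Quality gates of the kind a BFSI mart enforces before publishing.
    if mart["loan_id"].duplicated().any():
        raise ValueError("duplicate loan_id rows in mart load")
    if (mart["exposure"] < 0).any():
        raise ValueError("negative exposure detected")

    return mart[["loan_id", "period", "exposure"]]

src = pd.DataFrame({
    "loan_id": [1, 2], "booking_date": ["2024-03-05", "2024-03-20"],
    "principal": [1000.0, 2500.0], "accrued_interest": [10.0, None],
})
print(load_loan_mart(src))
```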
Posted 1 week ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals, including end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise:
- Minimum of 5 years of related experience
- Experience in modeling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Strong expertise in writing T-SQL code
- Well versed in data warehouse schemas and OLAP techniques
Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate
- A strong team player/leader, able to lead data transformation projects with multiple junior data engineers
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization
- Ability to clearly communicate complex business problems and technical solutions
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description: Power Apps & Power BI Engineer

Company Overview
SyrenCloud Inc. is a leading Data Engineering company that specializes in solving complex challenges in the Supply Chain Management industry. We have a team of over 350 employees and a robust revenue of $25M+. Our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We value both growth and employee well-being, striving to maintain a positive work environment while providing opportunities for professional development.

Role Summary
We are seeking a highly motivated Power Apps & Power BI Engineer to join our Digital Supply Chain team. In this role, you will focus on automating digital supply chain processes and data-driven decision-making using Power Apps and Power BI. You will be responsible for creating dynamic business applications and data visualizations, enabling business leaders, analysts, and partners to make informed decisions based on actionable insights. As a key player on the team, you will collaborate with business stakeholders to design, develop, and implement business intelligence solutions that drive operational excellence.

Key Responsibilities
- Power BI Dashboard Creation: Develop automated reports, KPI scorecards, and visually impactful dashboards using Power BI that address key business questions.
- Business Requirement Analysis: Engage with stakeholders to understand business needs and translate them into functional specifications for Power Apps and Power BI solutions.
- Power Apps Development: Design and implement business applications in Power Apps to streamline workflows, enhance data collection, and improve user experience for supply chain processes.
- Data Modeling & Integration: Develop data models and integrate data from various sources, transforming raw data into actionable insights for reporting and decision-making.
- Data Security & Compliance: Establish row-level security in Power BI and manage application security models to ensure data integrity and confidentiality.
- T-SQL & Power Query Development: Use T-SQL, Power Query, MDX, and DAX to extract, manipulate, and model data for Power BI reports and Power Apps applications.
- Collaboration: Work closely with cross-functional teams to understand reporting requirements, design solutions, and implement dashboards and business apps that meet user needs.
- System Enhancement: Continuously improve and optimize the business intelligence and application systems to meet evolving business demands.
- Documentation & Reporting: Create clear, concise documentation for data models, reporting systems, and Power Apps applications, including descriptions of techniques, models, and relationships.

Qualifications
- Experience: 4+ years of experience as a Power BI Developer or Power Apps Engineer, with a solid understanding of business intelligence, data modeling, and app development.
- Technical Proficiency: Strong proficiency in Power BI, Power Apps, SQL Server, T-SQL, Power Query, MDX, and DAX.
- Database Knowledge: A solid understanding of relational databases, OLTP/OLAP, and multidimensional database design concepts.
- Problem Solving: Strong analytical skills with the ability to turn complex business requirements into effective reporting solutions and applications.
- Communication Skills: Excellent written and verbal communication skills, with the ability to articulate requirements and technical solutions effectively.
- Agile Environment: Ability to thrive in an agile development environment, managing tight deadlines and shifting priorities.
- Teamwork: Comfortable working with cross-functional, distributed teams to deliver solutions.

What We Offer
- A dynamic and collaborative work environment.
- Opportunities for professional growth and career advancement.
- Competitive salary and benefits package.
- Exposure to exciting, impactful projects in the supply chain domain.

If you're passionate about leveraging Power Apps and Power BI to drive innovation in supply chain management, we'd love to hear from you! This job description highlights the integration of Power Apps and Power BI for driving business intelligence and application development in Syren Cloud's Supply Chain team. The focus on automation, collaboration, and strategic data modeling ensures that both tools are leveraged effectively for business growth and innovation.
Posted 1 week ago
12.0 - 15.0 years
0 Lacs
India
On-site
- Degree in Computer Science or related disciplines
- At least 12-15 years of experience in SAP BW, BI, HANA, BusinessObjects & Power BI (implementation/development experience is a must)
- Knowledge of the integration of Power BI and SAP BusinessObjects tools with SAP NetWeaver BI/BW
- Strong experience with SAP BusinessObjects version 4.0 or above, including the newer Fiorified versions
- Good understanding of current SAP Business Intelligence solutions using BEx, Analysis for Office, and Web Intelligence (WebI)
- Good knowledge of MySQL, ABAP, SQL Script & AMDP
- Strong business process knowledge in at least 2 core SAP ECC domains
- BI integrations with different tools such as IBP, SAP ECC (RFC & ODP), SAP CPI-DS, Open Hubs, Power BI, etc.
- Strong development experience with SAP BusinessObjects Web Intelligence, Analytics, OLAP, and Power BI
- Working experience in Power BI: data modelling, data visualizations, development using the DAX query language, and integrations with SAP ecosystems
- Oversee all phases of the Software Development Life Cycle (SDLC), ensuring successful end-to-end delivery.

Core Competencies
- At least 12-15 years of experience in SAP BI design & development based on best practices
- BW modelling, native HANA, mixed-modelling scenarios
- BOBJ: AFO, WebI
- ABAP, AMDPs, HANA scripts
- Logistics, Finance, Sales, Procurement experience
- Power BI; BW/4HANA will be an added advantage
- Must have team management experience to handle 4-6 resources across multiple locations
- Experienced working in different delivery models, i.e., onsite, offshore, etc.
- Ability to provide accurate reporting on a periodic basis to key stakeholders
- Ability and desire to take ownership of BI projects and lead design, build, and support
- Collaborate closely with business teams to ensure successful delivery
- Flexible and adaptable; able to work with tight deadlines
- Willing to take up new challenges/areas/modules and technological disruptions
Posted 1 week ago
6.0 years
0 Lacs
Hyderābād
On-site
At Apple, new ideas quickly transform into groundbreaking products, services, and customer experiences. Bring passion and dedication to your work, and there's no telling what can be accomplished. As part of the Supply Chain Innovation team, you will play a pivotal role in building end-to-end, best-in-class software solutions for Apple's Supply Chain needs, ranging from Supply Planning and Demand Planning to Product Distribution and beyond. You will collaborate with various internal partners to define and implement solutions that optimize Apple's internal business processes.

Description
We are seeking an individual who can address challenges and find creative solutions. The ideal candidate should excel in collaborative environments and produce high-quality software under tight deadlines. Must be a self-starter, highly motivated, and able to work independently, while collaborating effectively with multi-functional teams across the globe (US, Singapore, India, and Europe). This role will have a direct impact on Apple's business, requiring interaction with various internal teams to deliver cutting-edge products in a dynamic, ever-evolving environment.

Key Responsibilities:
- Design, develop, and optimize highly scalable, distributed systems, leveraging cloud-native technologies and microservices architecture to build robust and resilient solutions.
- Lead proof-of-concept projects and pilot implementations to showcase new ideas.
- Strive for excellence by continuously seeking ways to enhance system reliability, performance, and security.
- Contribute to design and code reviews, and assist in debugging and resolving issues.
- Develop system components and take full responsibility for the timely delivery and quality of the work.
- Collaborate with product owners, developers, QA, support teams, and end users.
- Mentor and guide a team of engineers, fostering a culture of innovation and excellence.
- Tackle complex technical challenges, drive innovation, and stay up to date with emerging technologies.
- Collaborate with contractors to ensure successful project execution.
- Multitask and work independently with minimal supervision; occasionally handle application production (warranty) support.
- Excellent verbal and written communication skills.

Minimum Qualifications
- 6+ years of relevant experience in enterprise-level application development using advanced Oracle database technologies
- Hands-on experience with large-volume databases for both OLTP and OLAP environments
- Experience with databases such as SingleStore, Oracle, Snowflake, NoSQL, Graph, etc.
- Strong expertise in ETL, performance optimization, and maintenance of various solutions based on Oracle databases
- Strong ability to research, design, and develop complex technical solutions involving multiple technologies
- Bachelor's/Master's degree in Computer Science or equivalent

Preferred Qualifications
- Familiarity with Agile project management methodologies
- Experience with Python, Pandas DataFrames, SQLAlchemy, NumPy, etc.
- Experience in consuming and exposing web services (e.g., SOAP, REST)
- Knowledge of UNIX/Linux platforms and scripting experience with Shell, XML, JSON
- AI/ML-related experience; a strong understanding of LLMs, prompt engineering, and RAG
- Experience in developing applications for the Supply Chain business domain
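As a small, hedged illustration of the OLTP-to-OLAP movement in the qualifications above, here is a sketch using the Python stack the posting names (pandas and SQLAlchemy). SQLite stands in for the production Oracle/Snowflake databases, and the tables and columns are invented:

```python
# Move row-level OLTP data into a pre-aggregated OLAP summary table.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///:memory:")

# OLTP side: row-level transactions.
orders = pd.DataFrame({
    "order_id": [1, 2, 3], "sku": ["A", "A", "B"], "qty": [2, 1, 5],
})
orders.to_sql("orders", engine, index=False)

# OLAP side: an aggregate table shaped for reporting queries.
summary = pd.read_sql(
    "SELECT sku, SUM(qty) AS total_qty FROM orders GROUP BY sku", engine
)
summary.to_sql("orders_by_sku", engine, index=False, if_exists="replace")
print(summary)
```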
Posted 1 week ago
0 years
6 - 10 Lacs
Gurgaon
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant, Automation Test Lead!

Responsibilities
- Understand the need behind each requirement, beyond its face value, and design a proper machine-executable automation solution using Python scripts. You will receive business rules or automation test scenarios from the business or QA team to automate using Python and SQL; you will not be responsible for writing test cases. (A minimal sketch of this rule-automation pattern follows this listing.)
- Implement reusable solutions following best practice, and deliver the automation results on time.
- Maintain, troubleshoot, and optimize existing solutions.
- Collaborate with various disciplinary teams to align the automation solution with the broader engineering community.
- Documentation.
- Lead, coordinate, and guide the ETL manual and automation testers.
- You may get a chance to learn new technologies on the cloud as well.

Tech Stack (as of now)
1. Redshift
2. Aurora (PostgreSQL)
3. S3 object storage
4. EKS/ECR
5. SQS/SNS
6. Roles/Policies
7. Argo
8. Robot Framework
9. Nested JSON

Qualifications we seek in you!
Minimum Qualifications
1. Python scripting: strong on Python program design, Pandas, processes, and HTTP-style request protocols.
2. SQL technologies (PostgreSQL preferred): OLTP/OLAP, joins, grouping, aggregation, window functions, etc.
3. Basic command knowledge of Windows and Linux operating systems.
4. Git usage: understanding of version control systems and concepts like branch, pull request, commit, rebase, and merge.
5. SQL optimization knowledge is a plus.
6. Good understanding of and experience in data-structure-related work.

Preferred Qualifications (good to have, as Python code is deployed using these frameworks)
1. Docker is a plus: understanding of image/container concepts.
2. Kubernetes is a plus: understanding of the concepts and theory of k8s, especially pods, environments, etc.
3. Argo Workflows/Airflow is a plus.
4. Robot Framework is a plus.
5. Kafka is a plus: understanding of Kafka concepts and the event-driven method.

Why join Genpact?
- Lead AI-first transformation: build and scale AI solutions that redefine industries
- Make an impact: drive change for global enterprises and solve business challenges that matter
- Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Gurugram
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 9, 2025, 3:51:48 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
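Referencing the responsibility list above, here is a minimal sketch of the machine-executable business-rule pattern: each rule is a SQL query that returns violating rows, and a Python runner reports pass/fail. SQLite stands in for Redshift/Aurora, and the table and rules are invented examples.

```python
# Each business rule is expressed as SQL selecting violating rows;
# the runner executes every rule and reports PASS/FAIL with a count.
import sqlite3

RULES = {
    "amount_non_negative": "SELECT * FROM payments WHERE amount < 0",
    "currency_present":    "SELECT * FROM payments WHERE currency IS NULL",
}

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE payments (id INTEGER, amount REAL, currency TEXT);
    INSERT INTO payments VALUES (1, 120.0, 'USD'), (2, -5.0, 'EUR');
""")

for name, sql in RULES.items():
    violations = con.execute(sql).fetchall()
    status = "PASS" if not violations else f"FAIL ({len(violations)} rows)"
    print(f"{name}: {status}")
```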
Posted 1 week ago
3.0 - 6.0 years
3 - 4 Lacs
Bengaluru
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description
We are seeking a passionate data analyst to transform data into actionable insights and support decision-making in a global organization focused on pricing and commercial strategy. This role spans business analysis, requirements gathering, data modeling, solution design, and visualization using modern tools. The analyst will also maintain and improve existing analytics solutions, interpret complex datasets, and communicate findings clearly to both technical and non-technical audiences.

Essential Functions of the Job:
- Analyze and interpret structured and unstructured data using statistical and quantitative methods to generate actionable insights and ongoing reports.
- Design and implement data pipelines and processes for data cleaning, transformation, modeling, and visualization using tools such as Power BI, SQL, and Python.
- Collaborate with stakeholders to define requirements, prioritize business needs, and translate problems into analytical solutions.
- Develop, maintain, and enhance scalable analytics solutions and dashboards that support pricing strategy and commercial decision-making.
- Identify opportunities for process improvement and operational efficiency through data-driven recommendations.
- Communicate complex findings in a clear, compelling, and actionable manner to both technical and non-technical audiences.

Analytical/Decision Making Responsibilities:
- Apply a hypothesis-driven approach to analyzing ambiguous or complex data and synthesizing insights to guide strategic decisions.
- Promote adoption of best practices in data analysis, modeling, and visualization, while tailoring approaches to meet the unique needs of each project.
- Tackle analytical challenges with creativity and rigor, balancing innovative thinking with practical problem-solving across varied business domains.
- Prioritize work based on business impact and deliver timely, high-quality results in fast-paced environments with evolving business needs.
- Demonstrate sound judgment in selecting methods, tools, and data sources to support business objectives.

Knowledge and Skills Requirements:
- Proven experience as a data analyst, business analyst, data engineer, or similar role.
- Strong analytical skills with the ability to collect, organize, analyze, and present large datasets accurately.
- Foundational knowledge of statistics, including concepts like distributions, variance, and correlation.
- Skilled in documenting processes and presenting findings to both technical and non-technical audiences.
- Hands-on experience with Power BI for designing, developing, and maintaining analytics solutions.
- Proficient in both Python and SQL, with strong programming and scripting skills.
- Skilled in using Pandas, T-SQL, and Power Query M for querying, transforming, and cleaning data (a short pandas example follows this listing).
- Hands-on experience in data modeling for both transactional (OLTP) and analytical (OLAP) database systems.
- Strong visualization skills using Power BI and Python libraries such as Matplotlib and Seaborn.
- Experience with defining and designing KPIs and aligning data insights with business goals.

Additional/Optional Knowledge and Skills:
- Experience with the Microsoft Fabric data analytics environment.
- Proficiency in using the Apache Spark distributed analytics engine, particularly via PySpark and Spark SQL.
- Exposure to implementing machine learning or AI solutions in a business context.
- Familiarity with Python machine learning libraries such as scikit-learn, XGBoost, PyTorch, or transformers.
- Experience with Power Platform tools (Power Apps, Power Automate, Dataverse, Copilot Studio, AI Builder).
- Knowledge of pricing, commercial strategy, or competitive intelligence.
- Experience with cloud-based data services, particularly in the Azure ecosystem (e.g., Azure Synapse Analytics or Azure Machine Learning).

Supervision Responsibilities:
- Operates with a high degree of independence and autonomy.
- Collaborates closely with cross-functional teams including sales, pricing, and commercial strategy.
- Mentors junior team members, helping develop technical skills and business domain knowledge.

Other Requirements:
- Collaborates with a team operating primarily in the Eastern Time Zone (UTC−4:00/−5:00).
- Limited travel may be required for this role.

Job Requirements:
Education: A bachelor's degree in a STEM field relevant to data analysis, data engineering, or data science is required. Examples include (but are not limited to) computer science, statistics, data analytics, artificial intelligence, operations research, or econometrics.
Experience: 3-6 years of experience in data analysis, data engineering, or a closely related field, ideally within a professional services environment.
Certification Requirements: No certifications are required for this role.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
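To ground the pandas and statistics requirements in the listing above, here is a compact, hedged example of a typical clean-then-summarize step; the data and column names are fabricated for illustration.

```python
# Clean a raw extract (impute missing values, filter invalid rows), then
# compute the foundational statistics the posting names: variance and
# correlation. All values below are made up.
import pandas as pd

raw = pd.DataFrame({
    "deal_size": [10.0, None, 25.0, 40.0, 12.0],
    "discount":  [0.05, 0.10, None, 0.20, 0.07],
    "region":    ["EMEA", "EMEA", "APAC", "APAC", "AMER"],
})

clean = (raw
         .assign(deal_size=raw["deal_size"].fillna(raw["deal_size"].median()),
                 discount=raw["discount"].fillna(raw["discount"].median()))
         .query("deal_size > 0"))

print(clean.groupby("region")["deal_size"].agg(["mean", "var"]))
print("corr(deal_size, discount) =",
      clean["deal_size"].corr(clean["discount"]).round(3))
```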
Posted 1 week ago
2.0 - 9.0 years
4 - 5 Lacs
Coimbatore
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data Analytics Testing - Staff

The opportunity
As a Data Analytics Test Engineer, you will be responsible for testing Business Intelligence and Data Warehousing solutions, both on-premise and on cloud platforms, and should ensure the quality of deliverables. You will work closely with the Test Lead for the projects under test. Testing proficiency in ETL, data warehousing, and Business Intelligence is required for this position. Experience in testing big data/unstructured data using the Hadoop/Spark framework, cloud platform knowledge in AWS or Azure, and knowledge of predictive analytics, machine learning, and artificial intelligence are added advantages.

Skills and attributes for success
- Delivery of testing needs for BI & DWH projects.
- Ability to communicate effectively with team members across geographies.
- Perform unstructured data/big data testing both on-premise and on cloud platforms.
- Thorough understanding of requirements and ability to provide feedback on them.
- Develop test strategies for BI & DWH projects covering aspects such as ETL testing and reports testing (front-end and back-end testing), integration testing, and UAT as needed. (A sketch of a typical back-end reconciliation check follows this listing.)
- Provide inputs for test planning aligned with the test strategy.
- Perform test case design and identify opportunities for test automation; develop test cases, both manual and automation scripts, as required.
- Ensure test readiness (test environment, test data, tool licenses, etc.).
- Perform test execution and report progress; report defects and liaise with development and other relevant teams for defect resolution.
- Prepare test reports and provide inputs to the Test Lead for test sign-off/closure.
- Provide support in project meetings/calls with clients for status reporting.
- Provide inputs on test metrics to the Test Lead; support analysis of metric trends and implement improvement actions as necessary.
- Handle changes and conduct regression testing; generate test summary reports.
- Coordinate test team members and the development team; interact with client-side stakeholders to solve issues and provide status updates.
- Actively take part in providing analytics and advanced analytics testing trainings in the company.

To qualify for the role, you must have
- BE/BTech/MCA/M.Sc.
- Overall 2 to 9 years of experience in testing data warehousing/business intelligence solutions, with a minimum of 2 years of experience in testing BI & DWH technologies and analytics applications.
- Experience in big data testing with the Hadoop/Spark framework and exposure to predictive analytics testing.
- Very good understanding of business intelligence concepts, architecture, and building blocks in ETL processing, data warehouses, dashboards, and analytics.
- Experience in cloud AWS/Azure infrastructure testing is desirable.
- Knowledge of Python data processing is desirable.
- Testing experience in more than one of these areas: Data Quality, ETL, OLAP, Reports.
- Good working experience with SQL Server or Oracle databases and proficiency in SQL scripting.
- Experience in back-end testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies.
- Experience in ETL testing using commercial ETL tools is desirable.
- Knowledge of/experience in SSRS (SQL Server Reporting Services), Spotfire, and SSIS is desirable.
- Experience/knowledge in data transformation projects, database design concepts, and white-box testing is desirable.

Ideally, you'll also have
- The ability to contribute as an individual contributor and, when required, lead a small team.
- The ability to create test strategies and test plans for testing BI & DWH applications/solutions that are moderate to complex/high-risk systems.
- The ability to design test cases and test data, and perform test execution and reporting.
- The ability to perform test management for small projects as and when required.
- Participation in defect triaging and tracking defects to resolution/conclusion.
- Experience/exposure to test automation; scripting experience in Perl and shell is desirable.
- Experience with test management and defect management tools, preferably HP ALM.
- Good communication skills (both written and verbal), with the ability to articulate concisely and clearly.
- Good understanding of SDLC and the test process in particular, as well as the project life cycle and test life cycle.
- Good analytical and problem-solving/troubleshooting skills.
- Exposure to CMMI and process improvement frameworks is a plus.
- Readiness to take on an individual contributor as well as a team leader role.

What working at EY offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
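The back-end reconciliation check referenced in the listing above, sketched minimally: compare a target table against its source on row counts and column totals. SQLite stands in for the warehouse, and all names are invented.

```python
# Profile source and target on COUNT(*) and SUM(balance), then assert
# they match; a mismatch is exactly the kind of defect ETL testing catches.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_customers (id INTEGER, balance REAL);
    CREATE TABLE tgt_customers (id INTEGER, balance REAL);
    INSERT INTO src_customers VALUES (1, 100.0), (2, 250.0);
    INSERT INTO tgt_customers VALUES (1, 100.0), (2, 250.0);
""")

def profile(table: str) -> tuple:
    return con.execute(
        f"SELECT COUNT(*), ROUND(SUM(balance), 2) FROM {table}"
    ).fetchone()

src, tgt = profile("src_customers"), profile("tgt_customers")
assert src == tgt, f"reconciliation failed: source={src} target={tgt}"
print("row count and balance totals match:", src)
```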
Posted 1 week ago
6.0 years
3 - 6 Lacs
Calcutta
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics - Senior - IICS Developer

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers, asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
A Senior Designer and Developer working with Informatica Intelligent Cloud Services (IICS) typically has a broad set of responsibilities centered on designing, developing, and managing complex data integration workflows. The role spans multiple data sources, including databases, files, cloud storage, and APIs, to ensure seamless data movement and transformation for analytics and business intelligence purposes.

Key Roles and Responsibilities of an IICS Senior Designer and Developer

Designing and Developing Data Integration Solutions
- Develop and design ETL (Extract, Transform, Load) mappings and workflows using Informatica Cloud (IICS), integrating data from various sources such as files, multiple database tables, cloud storage, and APIs through ODBC and REST connectors.
- Configure synchronization tasks that may involve multiple database tables as sources, ensuring efficient data extraction and loading.
- Build reusable, parameterized mapping templates to handle different data loads, including full, incremental, and CDC (Change Data Capture) loads (a watermark-based sketch of this pattern appears after this listing).

Handling Multiple Data Sources
- Work with structured, semi-structured, and unstructured data sources, including Oracle, Azure SQL Managed Instance, Azure Data Lake, Azure Blob Storage, Salesforce Net Zero Cloud, Snowflake, and other cloud/on-premises platforms.
- Manage file ingestion tasks to load large datasets from on-premises systems to cloud data lakes or warehouses.
- Use various cloud connectors and transformations (e.g., Aggregator, Filter, Joiner, Lookup, Rank, Router) to process and transform data efficiently.

Data Quality, Governance, and Documentation
- Implement data quality and governance policies to ensure data accuracy, integrity, and security throughout the data integration lifecycle.
- Create and maintain detailed documentation such as source-to-target mappings, ETL design specifications, and data migration strategies.
- Develop audit frameworks to track data loads and support compliance requirements like SOX.
Project Planning and Coordination
- Plan and monitor ETL development projects; coordinate with cross-functional teams including system administrators, DBAs, data architects, and analysts to align on requirements and deliverables.
- Communicate effectively across organizational levels to report progress, troubleshoot issues, and coordinate deployments.

Performance Tuning and Troubleshooting
- Optimize ETL workflows and mappings for performance, including tuning SQL/PL-SQL queries and Informatica transformations.
- Troubleshoot issues using IICS frameworks and collaborate with Informatica support as needed.

Leadership and Mentoring (Senior Role Specific)
- Oversee design and development efforts, review the work of junior developers, and ensure adherence to best practices and standards.
- Lead the creation of ETL standards, naming conventions, and methodologies to promote consistency and reusability across projects.

Summary of Skills and Tools Commonly Used
- Informatica Intelligent Cloud Services (IICS), Informatica Cloud Data Integration (CDI); 6-9 years of experience required
- Azure SQL Managed Instance, PL/SQL, API integrations (REST V2), ODBC connections, flat files, ADLS, Salesforce Net Zero Cloud
- Cloud platforms: Azure Data Lake, Azure Synapse (SQL Data Warehouse), Snowflake, AWS Redshift
- Data modelling and warehousing concepts, including OLAP and Star and Snowflake schemas
- Data quality tools and scripting languages such as Python, R, or SAS for advanced analytics support
- Project management and documentation tools; strong communication skills

In essence, a Senior IICS Designer and Developer role is a blend of technical expertise in data integration across multiple heterogeneous sources (files, tables, APIs), project leadership, and ensuring high-quality, scalable data pipelines that support enterprise BI and analytics initiatives.

What we look for
- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What working at EY offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
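The watermark-based sketch referenced in the listing above shows the logic behind incremental/CDC loads: extract only rows changed since the last successful run. IICS implements this with parameterized mappings; plain Python is used here only to show the pattern, and the table, column, and state-file names are hypothetical.

```python
# Watermark pattern for incremental extraction: read rows newer than the
# persisted high-water mark, and advance the mark only after success.
import json, pathlib, sqlite3

STATE = pathlib.Path("watermark.json")

def incremental_extract(con: sqlite3.Connection):
    last = json.loads(STATE.read_text())["last_ts"] if STATE.exists() else ""
    rows = con.execute(
        "SELECT id, updated_at FROM orders WHERE updated_at > ? "
        "ORDER BY updated_at", (last,)
    ).fetchall()
    if rows:
        # Persist the new high-water mark only after a successful load.
        STATE.write_text(json.dumps({"last_ts": rows[-1][1]}))
    return rows

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, updated_at TEXT);
    INSERT INTO orders VALUES (1, '2024-05-01T10:00'), (2, '2024-05-02T09:30');
""")
print(incremental_extract(con))   # both rows on first run; none on a rerun
```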
Posted 1 week ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from TCS!!! TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena, and there's nothing that can stop us from growing together. Your role is of key importance, as it lays down the foundation for the entire project. Make sure you have a valid EP number before the interview. To create an EP number, please visit https://ibegin.tcs.com/iBegin/register and complete the registration if you have not done it yet.

Position: Snowflake Developer
Job Location: Chennai
Experience: 4+ years

Desired Competencies (Technical/Behavioral Competency)
Must-Have:
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to Extract, Load, and Transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures. (A hedged Stream/Task sketch follows this listing.)
- In-depth understanding of data warehouse/ODS and ETL concepts and modeling structure principles.
- Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling.
- Experience gathering and analyzing system requirements.
- Good working knowledge of any ETL tool (Informatica or SSIS).

Good-to-Have:
- Familiarity with data visualization tools (Tableau/Power BI).
- Exposure to the AWS/Azure data ecosystem.

TCS Eligibility Criteria:
- BE/B.Tech/MCA/M.Sc./MS with a minimum of 3 years of relevant IT experience post-qualification.
- Only full-time courses will be considered.
- Candidates who have attended a TCS interview in the last month need not apply.

Referrals are always welcome! Please apply only if you are interested in attending the walk-in.

Thanks & Regards,
Parvathy
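A hedged sketch of the Snowflake utilities the posting names: a Stream to capture changes on a table and a Task that consumes them on a schedule. The DDL is standard Snowflake syntax, but the account credentials, warehouse, and table names are placeholders, so this will not run without a real Snowflake account.

```python
# Create a change-capture Stream and a scheduled Task that merges its
# contents. All connection values and object names are placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ETL_WH", database="DEMO", schema="PUBLIC",
)

statements = [
    "CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders",
    """CREATE OR REPLACE TASK merge_orders
           WAREHOUSE = ETL_WH
           SCHEDULE  = '5 MINUTE'
           WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
       AS INSERT INTO curated_orders SELECT * FROM orders_stream""",
    "ALTER TASK merge_orders RESUME",   # tasks are created suspended
]
cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
```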
Posted 1 week ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position: Technical Lead
Location: Bangalore/Pune/Hyderabad/Gurugram/Kolkata/Chennai/Mumbai
Experience: 8+ years

ABOUT HASHEDIN
We are software engineers who solve business problems with a product mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme-ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make? Visit us @ https://hashedin.com

JOB TITLE: Data Integration Tech Lead (Oracle ODI)
We are seeking an energetic and technically proficient Data Integration Tech Lead to design, build, and optimize robust data integration and analytics solutions using the Oracle technology stack. This role puts you at the core of our enterprise data modernization efforts, responsible for designing, implementing, and maintaining end-to-end data integration pipelines across traditional and cloud platforms. You will leverage your expertise in Oracle Data Integrator (ODI), Oracle Integration Cloud (OIC), and related technologies to drive efficient data movement, transformation, and loading while maintaining the highest standards of data quality, lineage, and governance. You will work hands-on and lead a small team of developers, shaping best practices for data integration workflows and collaborating with Analytics/BI teams to deliver fit-for-purpose solutions.

Mandatory Skills:

Experience
- 6-8 years of progressive experience in enterprise data integration, with at least 4 years of hands-on experience in Oracle Data Integrator (ODI).
- Strong understanding of and working experience with Oracle Integration Cloud (OIC), Oracle databases, and related cloud infrastructure.
- Proven track record in designing and implementing large-scale ETL/ELT solutions across hybrid (on-prem/cloud) architectures.

Technical Proficiency
- Deep hands-on expertise with ODI components (Topology, Designer, Operator, Agent) and OIC (integration patterns, adapters, process automation).
- Strong command of SQL and PL/SQL for data manipulation and transformation.
- Experience with REST/SOAP APIs, batch scheduling, and scripting (Python, Shell, or similar) for process automation.
- Data modeling proficiency (logical/physical, dimensional, OLAP/OLTP).
- Familiarity with Oracle Analytics Cloud (OAC), OBIEE, and integration into analytics platforms.
- Solid understanding of data quality frameworks, metadata management, and lineage documentation.
- Setting up Topology, building objects in Designer, monitoring in Operator, working with different types of Knowledge Modules (KMs), Agents, etc.
- Packaging components and database operations such as aggregate, pivot, union, etc.
- Using ODI mappings, error handling, automation using ODI, and migration of objects.
- Design and develop complex mappings, process flows, and ETL scripts.
- Expertise in developing load plans and scheduling jobs.
- Ability to design data quality and reconciliation frameworks using ODI.
- Integrate ODI with multiple source/target systems.
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dear Associate,

Greetings from Tata Consultancy Services!! Thank you for expressing your interest in exploring a career possibility with the TCS family. We have a job opportunity for ETL Test Engineer at Tata Consultancy Services.

Hiring for: ETL Test Engineer
Interview date: 14-June-25 (in-person drive)
Location: Bangalore
Experience: 4-10 years

Must Have:
1. SQL: expert-level knowledge of core SQL concepts and querying.
2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks.
3. Create and review complex test cases, test scripts, and test data for ETL processes.
4. ETL automation: experience in Datagaps; good to have experience in tools like Informatica, Talend, and Ab Initio.
5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems.
6. Experience in query optimization, stored procedures/views, and functions.
7. Strong familiarity with data warehouse projects and data modeling.
8. Understanding of BI concepts (OLAP vs. OLTP) and deploying applications on cloud servers.
9. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage.
11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes.
12. Azure DevOps/JIRA: hands-on experience with a test management tool, preferably ADO or JIRA.
13. Agile concepts: good experience in understanding agile methodology (Scrum, Lean, etc.).
14. Communication: good communication skills to understand and collaborate with all the stakeholders within the project.

If you are interested in this exciting opportunity, please share your updated resume at saranya.devi3@tcs.com along with the additional information mentioned below:
Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
University/Institute name:
Current Organization:
Willing to relocate to Bangalore:
Total Experience:
Relevant Experience (ETL Test Engineer):
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Available for in-person interview on 14-June-25:
Timings:
Attended an interview with TCS in the past (details):
Please share your iBegin portal EP id if already registered:

Note: only eligible candidates with relevant experience will be contacted further.
Posted 1 week ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role Summary
The AI, Data, and Analytics (AIDA) organization, part of Pfizer Digital, is responsible for the development and management of all data and analytics tools and platforms across the enterprise, from global product development, to manufacturing, to commercial, to the point of patient care across 100+ countries. One of the team's top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products, which serve as an enabler for the company's digital transformation to bring innovative therapeutics to patients.

We are looking for a technically skilled and experienced Reporting Engineering Manager who is passionate about developing BI and data visualization products for our customer-facing and sales-enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI reporting and visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences.

In this position, you will be accountable for a thorough understanding of data, business, and analytic requirements in order to deliver high-impact, relevant, interactive data visualization products that drive company performance through continuous monitoring and measurement, root-cause identification, and proactive detection of patterns and triggers across the company. This role will also drive best practices and standards for BI & visualization, work closely with stakeholders to understand their needs, and ensure that reporting assets are created with a focus on customer experience. The role requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Manager will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

Role Responsibilities
- Engineering expert in business intelligence and data visualization products in service of the field force and HQ enabling functions.
- Act as a technical BI & visualization developer on projects and collaborate with global team members (e.g., other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & visualization products at scale.
- Develop a thorough understanding of data, business, and analytic requirements (including BI product blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines.
- Deliver quality functional requirements and solution designs, adhering to established standards and best practices.
- Follow Pfizer processes in portfolio management, project management, and the Product Management Playbook, following Agile, hybrid, or Enterprise Solution Life Cycle approaches.
- Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, BusinessObjects, and MS SSRS.
- Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g., AWS Redshift, MS SQL, Snowflake, Oracle, Teradata).

Qualifications
- Bachelor's degree in a technical area such as computer science, engineering, or management information science.
- Recent healthcare/life sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred; domain knowledge in the pharmaceutical industry preferred.
- Good knowledge of data governance and data cataloging best practices.
- Relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management.
- Strong business analysis acumen to meet or exceed business requirements following user-centered design (UCD).
- Strong experience with testing of BI and analytics applications: unit testing (e.g., phased or Agile sprints or MVP), system integration testing (SIT), and user acceptance testing (UAT).
- Experience with technical solution management tools such as JIRA or GitHub.
- Stays abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools.

Technical Skillset
- 5+ years of hands-on experience in developing BI capabilities using MicroStrategy; proficiency in common BI tools, such as Tableau, Power BI, etc., is a plus.
- Common data model (logical & physical) and conceptual data model validation to create a consumption layer for reporting (dimensional model, semantic layer, direct database aggregates, or OLAP cubes).
- Development using a design system for reporting as well as ad hoc analytics templates.
- BI product scalability and performance tuning; platform administration and security, BI platform tenancy (licensing, capacity, vendor access, vulnerability testing).
- Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable.
- Experience with AWS services (EC2, EMR, RDS, Spark) is preferred.
- Solid understanding of Scrum/Agile is preferred, with working knowledge of CI/CD, GitHub, and MLflow.
- Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred.
- Great communication, business influencing, and stakeholder management skills.

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Hi {fullName},
There is an opportunity for Azure Data Engineer at Bangalore, for which a walk-in interview will be held on 14th Jun 25 between 9:00 AM and 12:30 PM. Please share the below details to mamidi.p@tcs.com with the subject line "AZURE DATA ENGINEER 14th Jun 25" if you are interested:
Email id:
Contact no:
Total exp:
Preferred location:
Current CTC:
Expected CTC:
Notice period:
Current organization:
Highest qualification that is full time:
Highest qualification university:
Any gap in education or employment:
If yes, how many years and reason for gap:
Are you available for walk-in interview at Bangalore on 14th Jun 25 (yes/no):
We will send you a mail by tomorrow night if you are shortlisted.
Role: Azure Data Engineer (Databricks)
Required Technical Skill Set: Azure SQL, Azure SQL DW, Azure Data Lake Store, Azure Data Factory. Must have implementation and operations experience with OLTP, OLAP, and DW technologies such as Azure SQL, Azure SQL DW, Azure Data Lake Store, and Azure Data Factory, and an understanding of Microsoft Azure PaaS features. Azure Cloud, Azure Databricks, and Data Factory knowledge are good to have; otherwise any cloud exposure. Ability to gather requirements from the client side and explain them to tech team members. Resolve conflicts in terms of bandwidth or design issues. Good understanding of data modeling, data analysis, and data governance. Very good communication skills and client handling skills.
Posted 1 week ago
0 years
0 Lacs
India
On-site
Job Duties and Responsibilities:
1. Data Architecture Design: Design scalable and secure enterprise data architectures for transactional systems, data warehouses, and data lakes. Define data flows, integration strategies, and architecture blueprints to support business intelligence and analytics platforms.
2. Data Modeling: Develop logical, physical, and conceptual data models to represent complex business requirements. Create and maintain dimensional models (star/snowflake schemas) and normalized data structures for OLTP and OLAP systems. Use data modeling tools such as ER/Studio, ERwin, or Lucidchart.
3. Data Governance & Quality: Define data standards, naming conventions, and metadata strategies to ensure consistency across systems. Collaborate with data stewards to enforce data quality rules, validations, and lineage tracking. Recommend technologies and platforms for data storage, integration, and processing (e.g., Snowflake, Azure Synapse, Redshift, Databricks). Guide data platform modernization and cloud migration strategies, ensuring alignment with enterprise goals.
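As a rough illustration of the star/snowflake bullet in section 2, here is a minimal star schema in SQL; the tables, keys, and columns are illustrative assumptions only.

```sql
-- Illustrative star schema: two dimensions and one fact table.
CREATE TABLE dim_date (
    date_key    INT PRIMARY KEY,      -- surrogate key, e.g. 20250614
    full_date   DATE NOT NULL,
    fiscal_year INT  NOT NULL
);

CREATE TABLE dim_customer (
    customer_key INT PRIMARY KEY,      -- surrogate key
    customer_id  VARCHAR(20) NOT NULL, -- natural/business key
    segment      VARCHAR(50)
);

CREATE TABLE fact_orders (
    date_key     INT NOT NULL REFERENCES dim_date (date_key),
    customer_key INT NOT NULL REFERENCES dim_customer (customer_key),
    order_amount DECIMAL(18,2) NOT NULL,
    order_count  INT NOT NULL DEFAULT 1
);
```

A snowflake variant would further normalize dim_customer (for example, moving segment into its own table), whereas an OLTP model would stay fully normalized throughout.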
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: ETL Test Engineer. Experience range: 4-10 years. Location: Current location must be Bangalore ONLY. NOTE: Candidates interested in a walk-in drive in Bangalore must apply.
Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL – expert-level knowledge of core SQL concepts and querying.
3. ETL automation – experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts – OLAP vs OLTP – and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA – hands-on experience with test management tools, preferably ADO or JIRA.
9. Agile concepts – good experience in understanding agile methodology (Scrum, Lean, etc.).
10. Communication – good communication skills to understand and collaborate with all the stakeholders within the project.
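To give candidates a feel for the SQL side of ETL testing, a few typical reconciliation checks are sketched below against hypothetical staging (stg_orders) and warehouse (dw_orders) tables.

```sql
-- 1. Row-count match between the source extract and the loaded target.
SELECT
    (SELECT COUNT(*) FROM stg_orders) AS source_rows,
    (SELECT COUNT(*) FROM dw_orders)  AS target_rows;

-- 2. Completeness: target rows that arrived without a business key.
SELECT COUNT(*) AS null_keys
FROM dw_orders
WHERE order_id IS NULL;

-- 3. Value reconciliation: measure mismatches between source and target.
SELECT s.order_id, s.amount AS src_amount, t.amount AS tgt_amount
FROM stg_orders s
JOIN dw_orders  t ON t.order_id = s.order_id
WHERE s.amount <> t.amount;
```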
Posted 1 week ago
0.0 - 10.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Indore, Madhya Pradesh, India; Noida, Uttar Pradesh, India
Qualification:
Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming and the following technical skills:
PL/SQL and PostgreSQL programming – ability to write complex SQL queries and stored procedures.
Migration – working experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
Working experience on Cloud SQL/AlloyDB.
Working experience tuning autovacuum in PostgreSQL.
Working experience tuning AlloyDB/PostgreSQL for better performance.
Working experience on BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL.
Ability to tune the AlloyDB/Cloud SQL database for better performance.
Experience with the GCP Database Migration Service.
Working experience on MongoDB.
Working experience on Cloud Dataflow.
Working experience on database disaster recovery.
Working experience on database job scheduling.
Working experience on database logging techniques.
Knowledge of OLTP and OLAP.
Desirable: GCP Database Engineer Certification.
Other Skills: Out-of-the-box thinking; problem-solving skills; ability to make tech choices (build vs. buy); performance management (profiling, benchmarking, testing, fixing); enterprise architecture; project management/delivery capability/quality mindset; scope management; planning (phasing, critical path, risk identification); schedule management/estimations; leadership skills. Other soft skills: learning ability; innovative/initiative skills.
Skills Required: PostgreSQL, PL/SQL, BigQuery
Role: Develop, construct, test, and maintain data architectures. Migrate enterprise Oracle databases from on-prem to GCP cloud. Tune autovacuum in PostgreSQL. Tune AlloyDB/PostgreSQL for better performance. Performance tuning of PostgreSQL stored procedure code and queries. Convert Oracle stored procedures and queries to PostgreSQL stored procedures and queries. Create a hybrid data store with data warehouse and NoSQL GCP solutions along with PostgreSQL. Migrate Oracle table data from Oracle to AlloyDB. Lead the database team.
Experience: 7 to 10 years
Job Reference Number: 12779
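Since the posting calls out autovacuum tuning several times, a minimal PostgreSQL/AlloyDB sketch follows; the table name is hypothetical and the thresholds are examples, not recommendations.

```sql
-- Per-table autovacuum tuning for a hypothetical high-churn table.
ALTER TABLE orders SET (
    autovacuum_vacuum_scale_factor  = 0.05,  -- vacuum after ~5% dead rows
    autovacuum_vacuum_threshold     = 1000,
    autovacuum_analyze_scale_factor = 0.02
);

-- Check whether autovacuum is keeping up: dead tuples and last runs.
SELECT relname, n_dead_tup, last_autovacuum, last_autoanalyze
FROM pg_stat_user_tables
ORDER BY n_dead_tup DESC
LIMIT 10;
```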
Posted 1 week ago
0.0 - 6.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Noida, Uttar Pradesh, India
Qualification: Pre-Sales Solution Engineer – India
Experience areas or skills: Pre-sales experience with software or analytics products. Excellent verbal & written communication skills. OLAP tools or Microsoft Analysis Services (MSAS). Data engineering or data warehouse or ETL. Hadoop ecosystem, or AWS, Azure, or GCP cluster and processing. Tableau or MicroStrategy or any BI tool. HiveQL or Spark SQL or PL/SQL or T-SQL. Writing and troubleshooting SQL programs or MDX queries. Working on Linux and programming in Python, Java, or JavaScript would be a plus. Filling RFPs or questionnaires from customers. NDA, success criteria, project closure, and other documentation. Be willing to travel or relocate as per requirement.
Role: Acts as the main point of contact for customer contacts involved in the evaluation process. Product demonstrations to qualified leads. Product demonstrations in support of marketing activity such as events or webinars. Own RFP, NDA, PoC success criteria, PoC closure, and other documents. Secures alignment on process and documents with the customer/prospect. Owns the technical-win phases of all active opportunities. Understand customer domain and database schema. Provide OLAP and reporting solutions. Work closely with customers to understand and resolve environment, OLAP cube, or reporting related issues. Coordinate with the solutioning team for execution of PoCs as per the success plan. Create enhancement requests or identify requests for new features on behalf of customers or hot prospects.
Experience: 3 to 6 years
Job Reference Number: 10771
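To show the flavor of an OLAP demo a pre-sales engineer might walk a prospect through, here is a sketch in PostgreSQL/Snowflake-style SQL over a hypothetical sales table.

```sql
-- One statement yields every subtotal combination across two dimensions;
-- the NULL rows in the output are the rolled-up cells an OLAP cube
-- would serve from pre-aggregated storage.
SELECT
    region,
    product_line,
    SUM(revenue) AS total_revenue
FROM sales
GROUP BY CUBE (region, product_line)
ORDER BY region NULLS FIRST, product_line NULLS FIRST;
```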
Posted 1 week ago
0.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India; Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Hyderabad, Telangana, India; Gurgaon, Haryana, India
Qualification:
Required: Proven hands-on experience designing, developing, and supporting database projects for analysis in a demanding environment. Proficient in database design techniques – relational and dimensional designs. Experience and a strong understanding of the business analysis techniques used. High proficiency in the use of SQL or MDX queries. Ability to manage multiple maintenance, enhancement, and project related tasks. Ability to work independently on multiple assignments and to work collaboratively within a team is required. Strong communication skills with both internal team members and external business stakeholders.
Added Advantage: Hadoop ecosystem, or AWS, Azure, or GCP cluster and processing. Experience working on Hive or Spark SQL or Redshift or Snowflake will be an added advantage. Experience working on Linux systems. Experience with Tableau or MicroStrategy or Power BI or any BI tools will be an added advantage. Expertise in programming in Python, Java, or shell script would be a plus.
Roles & Responsibilities: Be the front-end person of the world's most scalable OLAP product company – Kyvos Insights. Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area. Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems. Be the go-to person for customers regarding technical issues during the project. Be instrumental in reading the pulse of the big data market and defining the roadmap of the product. Lead a few small but highly efficient teams of big data engineers. Efficient task status reporting to stakeholders and customers. Good verbal & written communication skills. Be willing to work off hours to meet timelines. Be willing to travel or relocate as per project requirements.
Experience: 5 to 10 years
Job Reference Number: 11078
Posted 1 week ago
12.0 years
0 Lacs
India
On-site
Job Description
We are seeking a highly experienced Senior Data Modeler with strong expertise in Data Vault modeling and data architecture. The ideal candidate will be responsible for analyzing complex business requirements and designing scalable and efficient data models that align with organizational goals.
Key Responsibilities: Analyze and translate business requirements into long-term data solutions. Design and implement conceptual, logical, and physical data models. Develop and apply transformation rules to ensure accurate data mapping across systems. Collaborate with development teams to define data flows and modeling strategies. Establish best practices for data design, coding, and documentation. Review and enhance existing data models for performance and compatibility. Optimize local and metadata models to improve system efficiency. Apply canonical modeling techniques to ensure data consistency. Troubleshoot and fine-tune data models for optimal performance. Conduct regular assessments of data systems for accuracy, variance, and performance.
Technical Skills Required: Proven experience in Data Vault modeling (mandatory). Strong knowledge of relational and dimensional data modeling (OLTP/OLAP). Hands-on experience with modeling tools such as Erwin, ER/Studio, Hackolade, Visio, or Lucidchart. Proficient in SQL and experienced with RDBMS such as Oracle, SQL Server, MySQL, and PostgreSQL. Exposure to NoSQL databases like MongoDB and Cassandra. Experience with data warehouses and BI tools such as Snowflake, Redshift, Databricks, Qlik, and Power BI. Familiarity with ETL processes, data integration, and data governance frameworks.
Preferred Qualifications: Minimum 12 years of experience in Data Modeling or Data Engineering. At least 5 years of hands-on experience with relational and dimensional modeling. Strong understanding of metadata management and related tools. Knowledge of transactional databases, data warehousing, and real-time data processing. Experience working with cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Databricks). Relevant certifications in Data Management, Data Modeling, or Cloud Data Engineering are a plus. Excellent communication, presentation, and interpersonal skills.
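Because Data Vault modeling is marked mandatory here, a minimal sketch of its three core structures (hub, satellite, link) may help; all names, hash widths, and columns are illustrative assumptions.

```sql
-- Hub: one row per business key.
CREATE TABLE hub_customer (
    customer_hk   CHAR(32)    PRIMARY KEY,  -- hash of the business key
    customer_id   VARCHAR(20) NOT NULL,     -- business key
    load_dts      TIMESTAMP   NOT NULL,
    record_source VARCHAR(50) NOT NULL
);

-- Satellite: descriptive context, versioned by load timestamp.
CREATE TABLE sat_customer_details (
    customer_hk   CHAR(32)  NOT NULL REFERENCES hub_customer (customer_hk),
    load_dts      TIMESTAMP NOT NULL,
    customer_name VARCHAR(100),
    segment       VARCHAR(50),
    hash_diff     CHAR(32),                 -- change detection
    PRIMARY KEY (customer_hk, load_dts)
);

-- Link: relationship between hubs (hub_order omitted for brevity).
CREATE TABLE link_customer_order (
    link_hk     CHAR(32)  PRIMARY KEY,
    customer_hk CHAR(32)  NOT NULL REFERENCES hub_customer (customer_hk),
    order_hk    CHAR(32)  NOT NULL,
    load_dts    TIMESTAMP NOT NULL
);
```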
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Celeros Flow Technology, LLC
Job Summary
The Senior ETL Developer will be a member of a global team with a key emphasis on providing development and data integration expertise for SAP Data Services and the ETL process. This role will provide technical leadership to Data Analytics Analysts and Developers to establish best practices, ensuring efficient and scalable ETL workflows that support business intelligence and data reporting needs. This individual will design and deliver the end-to-end ETL process and Data Analytics technology infrastructure that will feed data to dashboards, scorecards, standard reports, and ad hoc reports. The individual has proven experience providing complex technology solutions, in both SAP Business Objects Data Services (BODS) and Power BI (PBI), to support key business processes and providing troubleshooting support within a global manufacturing environment. This individual will report to the Manager of the Data and Analytics Team. This role will be based at our Bangalore office.
Principal Duties and Responsibilities
Develop robust and scalable ETL processes in SAP Business Objects Data Services (BODS) for source SAP and non-SAP systems and target OLAP systems (SQL, etc.). Design, estimate, and create project plans for development, testing, and implementation of the ETL process and related tasks. Manage and maintain the BODS platform, including installation, configuration, upgrades, patching, and monitoring. Monitor BODS jobs, perform fast troubleshooting and root cause analysis, and provide fast turnaround with resolution of job failures and any other issues in the BODS production system. Identify opportunities for enhancements to the ETL process; work closely with business and technology partners to seek and provide effective resolution to business issues. Create documentation to assist business users and IT members in designing and effectively using the solutions developed. Develop and maintain comprehensive documentation for ETL processes, workflows, and BODS administration procedures. Lead ETL development, providing training, technical guidance, and ensuring best practices in ETL development. Ability to quickly learn report development in Power BI and other analytics applications.
Knowledge, Skills and Abilities
7-10 years of demonstrated technical mastery of design, development, deployment, and administration of SAP Business Objects Data Services and MSFT ETL applications. 5+ years of Data Warehouse and Data Integration experience working with SAP (ECC6), SQL, and other data warehouse & OLAP applications. Strong development and implementation expertise in SAP Information Steward and Data Quality; experienced in master data management and governance, creating, publishing, and maintaining data quality rules & scorecards. Designing complex SAP Data Services job flows to extract from and load to SAP systems and SQL Servers. Efficient in all phases of the development lifecycle, covering data cleansing, data conversion, performance tuning, and system testing. Strong knowledge of BODS scheduling and Management Console. Expert with Data Integration transforms such as Query, Validation, and Case transforms, as well as Data Quality transforms such as Match, Associate & Data Cleanse and other transforms. Configuring BODS components including job server, repositories, and service designers. Deep understanding of enterprise data warehousing best practices and standards. Strong expertise with SQL scripting, creating SSIS packages, and DB migrations. Strong understanding and knowledge of SAP FICO, SD, MM/Pur, and PP data and tables. Experience with creating and maintaining SQL servers and databases. Experience in creating the technical design, architecture, and data flow diagrams for BI and analytics applications. Experience with Azure services like Azure Data Factory, Azure SQL Database, or Azure Synapse Analytics.
Education and Experience
B.Tech/B.E/MCA/Master's in Business Systems Analysis in a relevant stream through a regular course from a recognized university and institute in India. 7-10 years of relevant experience in SAP BODS and ETL applications, working in a global organization. One or more business intelligence certifications (SAP, Microsoft SQL/Azure, GCP, etc.).
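SAP Information Steward rules are configured in the tool rather than hand-coded, but each validation rule conceptually reduces to a scored predicate; a hedged SQL equivalent of one scorecard rule, over a hypothetical material_master table, is sketched below.

```sql
-- Conceptual SQL equivalent of a data-quality scorecard rule.
-- Table and rule names are hypothetical illustrations.
SELECT
    'material_master'                       AS scorecard_entity,
    'MAT_DESC_NOT_BLANK'                    AS rule_name,
    COUNT(*)                                AS rows_checked,
    SUM(CASE WHEN material_desc IS NULL
              OR material_desc = ''
             THEN 1 ELSE 0 END)             AS rows_failed,
    100.0 * SUM(CASE WHEN material_desc IS NOT NULL
                      AND material_desc <> ''
                     THEN 1 ELSE 0 END) / COUNT(*) AS pass_pct
FROM material_master;
```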
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Experience: 5 to 9 years. Location: Pune. Job Type: Contract (for client).
Responsibilities / Job Description
Development of high-quality database solutions. Develop, implement, and optimize stored procedures and functions using SQL. Review and interpret ongoing business report requirements. Analyze existing SQL queries for performance improvements. Gather user requirements and identify new features. Provide data management support to users. Ensure all database programs meet company and performance requirements. Build appropriate and useful reporting deliverables. Suggest new queries. Provide timely scheduled management reporting. Investigate exceptions regarding asset movements. Mentor junior team members as needed. Work with data architects to ensure that solutions are aligned with company-wide technology directions.
Required Technical Skills
Bachelor's degree in IT, computer science, or a related field. 5+ years of experience as a SQL Developer or in a similar role. Strong proficiency with SQL and its variations among popular databases (Snowflake). Strong skills in performance tuning of complex SQL queries, procedures, and indexing strategies. Experience in designing OLAP databases using data warehouse patterns and schemas, including facts, dimensions, sorting keys, indexes, constraints, etc. Query design and performance tuning of complex queries for very large data sets. Knowledge of best practices when dealing with relational databases. Capable of troubleshooting common database issues. Translating functional and technical requirements into detailed designs. Data analysis experience, for example mapping source-to-target rules and fields.
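As a reference point for the stored-procedure requirement, here is a minimal Snowflake SQL Scripting sketch of a reporting refresh; object names are hypothetical and error handling is trimmed.

```sql
-- Idempotent delete-then-insert refresh for one load date (sketch).
CREATE OR REPLACE PROCEDURE refresh_daily_sales(p_load_date DATE)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
    DELETE FROM rpt_daily_sales WHERE sales_date = :p_load_date;

    INSERT INTO rpt_daily_sales (sales_date, region, total_amount)
    SELECT sales_date, region, SUM(amount)
    FROM raw_sales
    WHERE sales_date = :p_load_date
    GROUP BY sales_date, region;

    RETURN 'Refreshed ' || TO_VARCHAR(p_load_date);
END;
$$;

-- CALL refresh_daily_sales('2025-06-01'::DATE);
```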
Posted 1 week ago
With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.
Major Indian IT hubs such as Bengaluru, Hyderabad, Pune, Noida, and Gurugram are known for having a high concentration of IT companies and organizations that require OLAP professionals.
The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.
Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.
In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
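A one-query illustration of the OLAP side of that skill set: standard SQL's ROLLUP produces the subtotal hierarchy an OLAP cube answers from pre-aggregated cells (hypothetical schema).

```sql
-- Subtotals per (year, quarter), per year, and a grand total in one pass.
SELECT
    sales_year,
    sales_quarter,
    SUM(revenue) AS total_revenue
FROM sales_summary
GROUP BY ROLLUP (sales_year, sales_quarter);
```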
As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!