4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description: Power Apps & Power BI Engineer

Company Overview
SyrenCloud Inc. is a leading Data Engineering company that specializes in solving complex challenges in the Supply Chain Management industry. We have a team of over 350 employees and a robust revenue of $25M+. Our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We value both growth and employee well-being, striving to maintain a positive work environment while providing opportunities for professional development.

Role Summary
We are seeking a highly motivated Power Apps & Power BI Engineer to join our Digital Supply Chain team. In this role, you will focus on automating digital supply chain processes and data-driven decision-making using Power Apps and Power BI. You will be responsible for creating dynamic business applications and data visualizations, enabling business leaders, analysts, and partners to make informed decisions based on actionable insights. As a key player in the team, you will collaborate with business stakeholders to design, develop, and implement business intelligence solutions that drive operational excellence.

Key Responsibilities
Power BI Dashboard Creation: Develop automated reports, KPI scorecards, and visually impactful dashboards using Power BI that address key business questions.
Business Requirement Analysis: Engage with stakeholders to understand business needs and translate them into functional specifications for Power Apps and Power BI solutions.
Power Apps Development: Design and implement business applications in Power Apps to streamline workflows, enhance data collection, and improve user experience for supply chain processes.
Data Modeling & Integration: Develop data models and integrate data from various sources, transforming raw data into actionable insights for reporting and decision-making.
Data Security & Compliance: Establish row-level security in Power BI and manage application security models to ensure data integrity and confidentiality.
TSQL & Power Query Development: Utilize TSQL, Power Query, MDX, and DAX to extract, manipulate, and model data for Power BI reports and Power Apps applications.
Collaboration: Work closely with cross-functional teams to understand reporting requirements, design solutions, and implement dashboards and business apps that meet user needs.
System Enhancement: Continuously improve and optimize the business intelligence and application systems to meet evolving business demands.
Documentation & Reporting: Create clear, concise documentation for data models, reporting systems, and Power Apps applications, including descriptions of techniques, models, and relationships.

Qualifications
Experience: Minimum 4 years of experience as a Power BI Developer or Power Apps Engineer, with a solid understanding of business intelligence, data modeling, and app development.
Technical Proficiency: Strong proficiency in Power BI, Power Apps, SQL Server, TSQL, Power Query, MDX, and DAX.
Database Knowledge: A solid understanding of relational databases, OLTP/OLAP, and multidimensional database design concepts.
Problem Solving: Strong analytical skills with the ability to turn complex business requirements into effective reporting solutions and applications.
Communication Skills: Excellent written and verbal communication skills, with the ability to articulate requirements and technical solutions effectively.
Agile Environment: Ability to thrive in an agile development environment, managing tight deadlines and shifting priorities.
Teamwork: Comfortable working with cross-functional, distributed teams to deliver solutions.

What We Offer
A dynamic and collaborative work environment.
Opportunities for professional growth and career advancement.
Competitive salary and benefits package.
Exposure to exciting, impactful projects in the supply chain domain.

If you’re passionate about leveraging Power Apps and Power BI to drive innovation in supply chain management, we’d love to hear from you! This job description highlights the integration of Power Apps and Power BI for driving business intelligence and application development in Syren Cloud’s Supply Chain team. The focus on automation, collaboration, and strategic data modeling ensures that both tools are leveraged effectively for business growth and innovation.
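The responsibilities above call out row-level security (RLS) and DAX. In Power BI itself, RLS is a DAX filter rule attached to a role (for example, [Region] = USERPRINCIPALNAME()); the sketch below illustrates the same filtering idea in Python with pandas, since the logic is analogous. The table, users, and regions are hypothetical.

```python
import pandas as pd

# Hypothetical fact table and user-to-region security mapping.
sales = pd.DataFrame({
    "region": ["APAC", "EMEA", "APAC", "AMER"],
    "revenue": [120, 90, 75, 200],
})
user_regions = {
    "analyst@example.com": {"APAC"},
    "lead@example.com": {"APAC", "EMEA", "AMER"},
}

def rows_visible_to(user: str) -> pd.DataFrame:
    """Return only the rows the user's security role permits (row-level security)."""
    allowed = user_regions.get(user, set())
    return sales[sales["region"].isin(allowed)]

print(rows_visible_to("analyst@example.com"))  # only APAC rows
```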
Posted 1 month ago
12.0 - 15.0 years
0 Lacs
India
On-site
Degree in Computer Science or related disciplines
At least 12-15 years of experience in SAP BW, BI, HANA, BusinessObjects & Power BI (implementation/development experience is a must)
Knowledge of the integration of Power BI and SAP BusinessObjects tools with SAP NetWeaver BI/BW
Strong experience with SAP BusinessObjects version 4.0 or above, including Flory versions
Good understanding of current SAP Business Intelligence solutions using BEx, Analysis for Office, and Webi
Good knowledge of MySQL, ABAP, SQL Scripts & AMDP
Strong business process knowledge in at least 2 core SAP ECC domains
BI integrations with different tools like IBP, SAP ECC (RFC & ODP), SAP CPI-DS, Open Hubs, Power BI, etc.
Strong experience with development in the following tools: SAP BusinessObjects Web Intelligence, Analytics, OLAP, Power BI
Working experience in Power BI: data modelling, data visualizations, development using the DAX query language, and integrations with SAP ecosystems
Oversee all phases of the Software Development Life Cycle (SDLC), ensuring successful end-to-end delivery.

Core Competencies
At least 12-15 years of experience in SAP BI design & development based on best practices
BW modelling, native HANA, and mixed modelling scenarios
BOBJ – AFO, Webi
ABAP, AMDPs, HANA scripts
Logistics, Finance, Sales, Procurement experience
Power BI
BW/4HANA will be an added advantage.
Must have team management experience to handle 4-6 resources across multiple locations
Experienced working in different delivery models, i.e., onsite, offshore, etc.
Ability to provide accurate reporting on a periodic basis to key stakeholders
Ability & desire to take ownership of BI projects; lead design, build & support.
Collaborate closely with business teams to ensure successful delivery
Flexible & adaptable; able to work with tight deadlines
Willing to take up new challenges/areas/modules & technological disruptions
Posted 1 month ago
6.0 years
0 Lacs
Hyderābād
On-site
At Apple, new ideas quickly transform into groundbreaking products, services, and customer experiences. Bring passion and dedication to your work, and there’s no telling what can be accomplished. As part of the Supply Chain Innovation team, you will play a pivotal role in building end-to-end, best-in-class software solutions for Apple’s Supply Chain needs, ranging from Supply Planning and Demand Planning to Product Distribution and beyond. You will collaborate with various internal partners to define and implement solutions that optimize Apple’s internal business processes.

Description
We are seeking an individual who can address challenges and find creative solutions. The ideal candidate should excel in collaborative environments and produce high-quality software under tight deadlines. Must be a self-starter, highly motivated, and able to work independently, while collaborating effectively with multi-functional teams across the globe (US, Singapore, India, and Europe). This role will have a direct impact on Apple’s business, requiring interaction with various internal teams to deliver cutting-edge products in a dynamic, ever-evolving environment.

Key Responsibilities:
Design, develop, and optimize highly scalable, distributed systems, leveraging cloud-native technologies and microservices architecture to build robust and resilient solutions.
Lead proof-of-concept projects and pilot implementations to showcase new ideas.
Strive for excellence by continuously seeking ways to enhance system reliability, performance, and security.
Contribute to design and code reviews, and assist in debugging and resolving issues.
Develop system components and take full responsibility for the timely delivery and quality of the work.
Collaborate with product owners, developers, QA, support teams, and end users.
Mentor and guide a team of engineers, fostering a culture of innovation and excellence.
Tackle complex technical challenges, drive innovation, and stay up-to-date with emerging technologies.
Collaborate with contractors to ensure successful project execution.
Ability to multitask and work independently with minimal supervision.
Occasionally handle application production (warranty) support.
Excellent verbal and written communication skills.

Minimum Qualifications
6+ years of relevant experience in enterprise-level application development using advanced Oracle database technologies
Hands-on experience with large-volume databases for both OLTP and OLAP environments
Experience with databases such as SingleStore, Oracle, Snowflake, NoSQL, Graph, etc.
Strong expertise in ETL, performance optimization, and maintenance of various solutions based on Oracle databases
Strong ability to research, design, and develop complex technical solutions involving multiple technologies
Bachelor's / Master’s degree in Computer Science or equivalent

Preferred Qualifications
Familiarity with Agile project management methodologies
Experience with Python, Pandas DataFrames, SQLAlchemy, NumPy, etc.
Experience in consuming and exposing web services (e.g., SOAP, REST)
Knowledge of UNIX/Linux platforms and scripting experience with Shell, XML, JSON
AI/ML-related experience
A strong understanding of LLMs, prompt engineering, and RAG
Experience in developing applications for the Supply Chain business domain
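The preferred qualifications pair Python data tooling (pandas, SQLAlchemy, NumPy) with consuming REST services. A minimal sketch of that combination; the endpoint URL and field names are hypothetical placeholders:

```python
import pandas as pd
import requests

def fetch_supply_plan(url: str) -> pd.DataFrame:
    """Pull JSON records from a REST endpoint and load them into a DataFrame."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()                             # fail fast on HTTP errors
    df = pd.DataFrame(resp.json())                      # assumes a JSON array of records
    df["ship_date"] = pd.to_datetime(df["ship_date"])   # hypothetical column
    return df

# Example usage (placeholder URL):
# plan = fetch_supply_plan("https://internal.example.com/api/v1/supply-plan")
# print(plan.groupby("region")["units"].sum())
```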
Posted 1 month ago
0 years
6 - 10 Lacs
Gurgaon
On-site
Ready to build the future with AI? At Genpact, we don’t just keep up with technology—we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant, Automation Test Lead!

Responsibilities
Understand the need behind each requirement beyond its face value and design a proper machine-executable automation solution using Python scripts. You will receive business rules or automation test scenarios from the business or QA teams to automate using Python and SQL; you will not be responsible for writing test cases.
Implement reusable solutions following best practices, and deliver automation results on time.
Maintain, troubleshoot, and optimize existing solutions.
Collaborate with various disciplinary teams to align automation solutions with the broader engineering community.
Documentation.
Lead, coordinate, and guide the ETL manual and automation testers.
You may also get a chance to learn new technologies on the cloud.

Tech Stack (as of now)
1. Redshift
2. Aurora (PostgreSQL)
3. S3 object storage
4. EKS / ECR
5. SQS/SNS
6. Roles/Policies
7. Argo
8. Robot Framework
9. Nested JSON

Qualifications we seek in you!
Minimum Qualifications
1. Python scripting. Candidates should be strong in Python programming design, Pandas, processes, and HTTP-style request protocols.
2. SQL technologies (ideally PostgreSQL): OLTP/OLAP, joins, grouping, aggregation, window functions, etc.
3. Basic command knowledge of Windows / Linux operating systems.
4. Git usage. Understanding of version control systems and concepts like branch, pull request, commit, rebase, and merge.
5. SQL optimization knowledge is a plus.
6. Good understanding of and experience with data structures.

Preferred Qualifications
Good to have, as the Python code is deployed using these frameworks:
1. Docker is a plus: understanding of image/container concepts.
2. Kubernetes is a plus: understanding of k8s concepts and theory, especially pods, env, etc.
3. Argo Workflows / Airflow is a plus.
4. Robot Framework is a plus.
5. Kafka is a plus: understanding of Kafka concepts and event-driven methods.

Why join Genpact?
Lead AI-first transformation – Build and scale AI solutions that redefine industries
Make an impact – Drive change for global enterprises and solve business challenges that matter
Accelerate your career – Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
Grow with the best – Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
Committed to ethical AI – Work in an environment where governance, transparency, and security are at the core of everything we build
Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Gurugram
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 9, 2025, 3:51:48 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
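For a concrete taste of the automation this role describes (validating a business rule with Python and SQL), here is a minimal sketch. It assumes a PostgreSQL-compatible target such as Aurora reachable via psycopg2; the staging.orders table and order_id key are hypothetical:

```python
import psycopg2

# Window function flags every duplicate business key beyond the first occurrence.
DUPLICATE_KEY_CHECK = """
SELECT order_id
FROM (
    SELECT order_id,
           ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY loaded_at DESC) AS rn
    FROM staging.orders
) t
WHERE rn > 1;
"""

def assert_no_duplicate_keys(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(DUPLICATE_KEY_CHECK)
        dupes = cur.fetchall()
        assert not dupes, f"Business rule violated: duplicate order_id values {dupes[:5]}"
```

In practice a check like this would be wrapped as a Robot Framework keyword or an Argo workflow step so it runs on every load.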
Posted 1 month ago
3.0 - 6.0 years
3 - 4 Lacs
Bengaluru
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description
We are seeking a passionate data analyst to transform data into actionable insights and support decision-making in a global organization focused on pricing and commercial strategy. This role spans business analysis, requirements gathering, data modeling, solution design, and visualization using modern tools. The analyst will also maintain and improve existing analytics solutions, interpret complex datasets, and communicate findings clearly to both technical and non-technical audiences.

Essential Functions of the Job:
Analyze and interpret structured and unstructured data using statistical and quantitative methods to generate actionable insights and ongoing reports.
Design and implement data pipelines and processes for data cleaning, transformation, modeling, and visualization using tools such as Power BI, SQL, and Python.
Collaborate with stakeholders to define requirements, prioritize business needs, and translate problems into analytical solutions.
Develop, maintain, and enhance scalable analytics solutions and dashboards that support pricing strategy and commercial decision-making.
Identify opportunities for process improvement and operational efficiency through data-driven recommendations.
Communicate complex findings in a clear, compelling, and actionable manner to both technical and non-technical audiences.

Analytical/Decision Making Responsibilities:
Apply a hypothesis-driven approach to analyzing ambiguous or complex data and synthesizing insights to guide strategic decisions.
Promote adoption of best practices in data analysis, modeling, and visualization, while tailoring approaches to meet the unique needs of each project.
Tackle analytical challenges with creativity and rigor, balancing innovative thinking with practical problem-solving across varied business domains.
Prioritize work based on business impact and deliver timely, high-quality results in fast-paced environments with evolving business needs.
Demonstrate sound judgment in selecting methods, tools, and data sources to support business objectives.

Knowledge and Skills Requirements:
Proven experience as a data analyst, business analyst, data engineer, or similar role.
Strong analytical skills with the ability to collect, organize, analyze, and present large datasets accurately.
Foundational knowledge of statistics, including concepts like distributions, variance, and correlation.
Skilled in documenting processes and presenting findings to both technical and non-technical audiences.
Hands-on experience with Power BI for designing, developing, and maintaining analytics solutions.
Proficient in both Python and SQL, with strong programming and scripting skills.
Skilled in using Pandas, T-SQL, and Power Query M for querying, transforming, and cleaning data.
Hands-on experience in data modeling for both transactional (OLTP) and analytical (OLAP) database systems.
Strong visualization skills using Power BI and Python libraries such as Matplotlib and Seaborn.
Experience with defining and designing KPIs and aligning data insights with business goals.

Additional/Optional Knowledge and Skills:
Experience with the Microsoft Fabric data analytics environment.
Proficiency in using the Apache Spark distributed analytics engine, particularly via PySpark and Spark SQL.
Exposure to implementing machine learning or AI solutions in a business context.
Familiarity with Python machine learning libraries such as scikit-learn, XGBoost, PyTorch, or transformers.
Experience with Power Platform tools (Power Apps, Power Automate, Dataverse, Copilot Studio, AI Builder).
Knowledge of pricing, commercial strategy, or competitive intelligence.
Experience with cloud-based data services, particularly in the Azure ecosystem (e.g., Azure Synapse Analytics or Azure Machine Learning).

Supervision Responsibilities:
Operates with a high degree of independence and autonomy.
Collaborates closely with cross-functional teams including sales, pricing, and commercial strategy.
Mentors junior team members, helping develop technical skills and business domain knowledge.

Other Requirements:
Collaborates with a team operating primarily in the Eastern Time Zone (UTC−4:00 / −5:00).
Limited travel may be required for this role.

Job Requirements:
Education: A bachelor’s degree in a STEM field relevant to data analysis, data engineering, or data science is required. Examples include (but are not limited to) computer science, statistics, data analytics, artificial intelligence, operations research, or econometrics.
Experience: 3–6 years of experience in data analysis, data engineering, or a closely related field, ideally within a professional services environment.
Certification Requirements: No certifications are required for this role.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
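The skills list above centers on pandas-style transformation feeding a KPI view. A minimal sketch of that workflow; the pricing dataset and column names are invented for illustration:

```python
import pandas as pd

# Hypothetical raw pricing extract with the usual quality problems.
raw = pd.DataFrame({
    "deal_id": [1, 2, 2, 3],
    "list_price": [100.0, 250.0, 250.0, None],
    "net_price": [90.0, 200.0, 200.0, 55.0],
    "region": ["emea", "amer", "amer", "apac"],
})

clean = (
    raw.drop_duplicates(subset="deal_id")      # remove duplicate loads
       .dropna(subset=["list_price"])          # discard unpriceable rows
       .assign(region=lambda d: d["region"].str.upper(),
               discount_pct=lambda d: 1 - d["net_price"] / d["list_price"])
)

# KPI: average realized discount by region, ready for a Power BI or Matplotlib visual.
print(clean.groupby("region")["discount_pct"].mean().round(3))
```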
Posted 1 month ago
2.0 - 9.0 years
4 - 5 Lacs
Coimbatore
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting – Data Analytics Testing – Staff

The opportunity
As a Data Analytics Test Engineer, you will be responsible for testing Business Intelligence & Data Warehousing solutions, both on-premise and on cloud platforms, and should ensure the quality of deliverables. You will work closely with the Test Lead for the projects under test. Testing proficiency in the ETL, data warehousing, and business intelligence areas is required for this position. Experience in testing Big Data/unstructured data using the Hadoop/Spark framework, cloud platform knowledge in either AWS or Azure, and knowledge of predictive analytics, machine learning, and artificial intelligence are added advantages.

Skills and attributes for success
Delivery of testing needs for BI & DWH projects.
Ability to communicate effectively with team members across geographies.
Perform unstructured data / big data testing both on-premise and on cloud platforms.
Thorough understanding of requirements, with the ability to provide feedback on them.
Develop test strategies for BI & DWH projects covering aspects like ETL testing & reports testing (front-end and back-end testing), integration testing, and UAT as needed.
Provide inputs for test planning aligned with the test strategy.
Perform test case design and identify opportunities for test automation.
Develop test cases, both manual and automation scripts, as required.
Ensure test readiness (test environment, test data, tool licenses, etc.).
Perform test execution and report the progress.
Report defects and liaise with development & other relevant teams for defect resolution.
Prepare test reports and provide inputs to the Test Lead for test sign-off/closure.
Provide support in project meetings/calls with clients for status reporting.
Provide inputs on test metrics to the Test Lead, and support analysis of metric trends and implementation of improvement actions as necessary.
Handle changes and conduct regression testing.
Generate test summary reports.
Coordinate test team members and the development team.
Interact with client-side people to solve issues and update status.
Actively take part in providing analytics and advanced analytics testing trainings in the company.

To qualify for the role, you must have
BE/BTech/MCA/M.Sc
Overall 2 to 9 years of experience in testing data warehousing / business intelligence solutions, with a minimum of 2 years of experience in testing BI & DWH technologies and analytics applications.
Experience in Big Data testing with the Hadoop/Spark framework and exposure to predictive analytics testing.
Very good understanding of business intelligence concepts, architecture & building blocks in the areas of ETL processing, data warehouses, dashboards, and analytics.
Experience in cloud AWS/Azure infrastructure testing is desirable.
Knowledge of Python data processing is desirable.
Testing experience in more than one of these areas: Data Quality, ETL, OLAP, Reports.
Good working experience with SQL Server or Oracle databases and proficiency with SQL scripting.
Experience in back-end testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies.
Experience in ETL testing using commercial ETL tools is desirable.
Knowledge/experience in SSRS (SQL Server Reporting Services), Spotfire, and SSIS is desirable.
Experience/knowledge in data transformation projects, database design concepts & white-box testing is desirable.

Ideally, you’ll also have
Ability to contribute as an individual contributor and, when required, lead a small team.
Ability to create test strategies & test plans for testing BI & DWH applications/solutions that are moderate to complex / high-risk systems.
Design test cases and test data, and perform test execution & reporting.
Ability to perform test management for small projects as and when required.
Participate in defect triaging and track defects to resolution/conclusion.
Experience/exposure to test automation, and scripting experience in Perl & shell, is desirable.
Experience with test management and defect management tools, preferably HP ALM.
Good communication skills (both written & verbal).
Good understanding of SDLC, and the test process in particular.
Good analytical & problem-solving or troubleshooting skills.
Good understanding of the project life cycle and test life cycle.
Exposure to CMMi and process improvement frameworks is a plus.
Excellent communication skills, with the ability to articulate concisely & clearly.
Readiness to take on an individual contributor as well as a team leader role.

What working at EY offers
At EY, we’re dedicated to helping our clients, from start–ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
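A typical back-end ETL test in this posting's scope is reconciling a target warehouse table against its source. A minimal sketch using SQLAlchemy and pandas; the connection URLs and the sales table are placeholders:

```python
import pandas as pd
from sqlalchemy import create_engine

def reconcile(source_url: str, target_url: str) -> None:
    """Compare row counts and a control total between source and target."""
    src = create_engine(source_url)
    tgt = create_engine(target_url)

    q = "SELECT COUNT(*) AS rows, SUM(amount) AS total FROM sales"  # hypothetical table
    src_stats = pd.read_sql(q, src).iloc[0]
    tgt_stats = pd.read_sql(q, tgt).iloc[0]

    assert src_stats["rows"] == tgt_stats["rows"], "Row count mismatch after ETL load"
    assert abs(src_stats["total"] - tgt_stats["total"]) < 0.01, "Control total mismatch"

# Example usage with placeholder connection strings:
# reconcile("oracle+oracledb://user:pw@src-host/ORCL", "mssql+pyodbc://user:pw@target_dsn")
```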
Posted 1 month ago
6.0 years
3 - 6 Lacs
Calcutta
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics – Senior - IICS Developer
EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
A Senior Designer and Developer working with Informatica Intelligent Cloud Services (IICS) in roles involving multiple sources such as files and tables typically has a broad set of responsibilities centered around designing, developing, and managing complex data integration workflows. The role spans multiple data sources, including databases, files, cloud storage, and APIs, to ensure seamless data movement and transformation for analytics and business intelligence purposes.

Key Roles and Responsibilities of an IICS Senior Designer and Developer

Designing and Developing Data Integration Solutions
Develop and design ETL (Extract, Transform, Load) mappings and workflows using Informatica Cloud (IICS), integrating data from various sources such as files, multiple database tables, cloud storage, and APIs through ODBC and REST connectors.
Configure synchronization tasks that may involve multiple database tables as sources, ensuring efficient data extraction and loading.
Build reusable, parameterized mapping templates to handle different data loads including full, incremental, and CDC (Change Data Capture) loads.

Handling Multiple Data Sources
Work with structured, semi-structured, and unstructured data sources including Oracle, SQL Server Managed Instance, Azure Data Lake, Azure Blob Storage, Salesforce Net Zero Cloud, Snowflake, and other cloud/on-premises platforms.
Manage file ingestion tasks to load large datasets from on-premises systems to cloud data lakes or warehouses.
Use various cloud connectors and transformations (e.g., Aggregator, Filter, Joiner, Lookup, Rank, Router) to process and transform data efficiently.

Data Quality, Governance, and Documentation
Implement data quality and governance policies to ensure data accuracy, integrity, and security throughout the data integration lifecycle.
Create and maintain detailed documentation such as source-to-target mappings, ETL design specifications, and data migration strategies.
Develop audit frameworks to track data loads and support compliance requirements like SOX.
Project Planning and Coordination
Plan and monitor ETL development projects; coordinate with cross-functional teams including system administrators, DBAs, data architects, and analysts to align on requirements and deliverables.
Communicate effectively across organizational levels to report progress, troubleshoot issues, and coordinate deployments.

Performance Tuning and Troubleshooting
Optimize ETL workflows and mappings for performance, including tuning SQL/PL-SQL queries and Informatica transformations.
Troubleshoot issues using IICS frameworks and collaborate with Informatica support as needed.

Leadership and Mentoring (Senior Role Specific)
Oversee design and development efforts, review the work of junior developers, and ensure adherence to best practices and standards.
Lead the creation of ETL standards, naming conventions, and methodologies to promote consistency and reusability across projects.

Summary of Skills and Tools Commonly Used
Informatica Intelligent Cloud Services (IICS), Informatica Cloud Data Integration (CDI); 6–9 years of experience required
SQL Managed Instance, PL/SQL, API integrations (REST V2), ODBC connections, flat files, ADLS, Salesforce Net Zero Cloud
Cloud platforms: Azure Data Lake, Azure Synapse (SQL Data Warehouse), Snowflake, AWS Redshift
Data modelling and warehousing concepts including OLAP, star and snowflake schemas
Data quality tools and scripting languages such as Python, R, or SAS for advanced analytics support
Project management and documentation tools, strong communication skills

In essence, a Senior IICS Designer and Developer role is a blend of technical expertise in data integration across multiple heterogeneous sources (files, tables, APIs), project leadership, and ensuring high-quality, scalable data pipelines that support enterprise BI and analytics initiatives.

What we look for
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment
An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries

What working at EY offers
At EY, we’re dedicated to helping our clients, from start–ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
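The incremental/CDC loads described above usually reduce to a watermark pattern: remember the latest timestamp already loaded, then extract only newer rows. A minimal Python sketch of that logic, independent of IICS itself; the table and column names are hypothetical:

```python
import pandas as pd
from sqlalchemy import create_engine, text

def incremental_load(src_url: str, tgt_url: str) -> int:
    """Copy only rows newer than the target's high-water mark."""
    src, tgt = create_engine(src_url), create_engine(tgt_url)

    with tgt.connect() as conn:
        watermark = conn.execute(
            text("SELECT COALESCE(MAX(updated_at), '1900-01-01') FROM stg_orders")
        ).scalar_one()

    delta = pd.read_sql(
        text("SELECT * FROM orders WHERE updated_at > :wm"),
        src, params={"wm": watermark},
    )
    delta.to_sql("stg_orders", tgt, if_exists="append", index=False)
    return len(delta)
```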
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from TCS!!!
TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there’s nothing that can stop us from growing together. Your role is of key importance, as it lays down the foundation for the entire project.
Make sure you have a valid EP number before the interview. To create an EP number, please visit https://ibegin.tcs.com/iBegin/register and kindly complete the registration if you have not done it yet.

Position: Snowflake Developer
Job Location: Chennai
Experience: 4+ years
Job Title: Snowflake Developer

Desired Competencies (Technical/Behavioral Competency)
Must-Have:
Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to extract, load, and transform data.
Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures.
In-depth understanding of Data Warehouse/ODS, ETL concepts, and modeling structure principles.
Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling.
Experience gathering and analyzing system requirements.
Good working knowledge of any ETL tool (Informatica or SSIS).

Good-to-Have:
Familiarity with data visualization tools (Tableau/Power BI).
Exposure to the AWS / Azure data ecosystem.

TCS Eligibility Criteria:
*BE/B.Tech/MCA/M.Sc./MS with minimum 3 years of relevant IT experience post qualification.
*Only full-time courses would be considered.
*Candidates who have attended a TCS interview in the last 1 month need not apply.

Referrals are always welcome!!!
Please apply only if you are interested in attending the walk-in.

Thanks & Regards
Parvathy
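Streams and Tasks, listed in the must-haves, pair naturally: a stream captures changes on a table and a scheduled task drains it. A minimal sketch using the snowflake-connector-python package; the warehouse, database, and table names are hypothetical:

```python
import snowflake.connector

STATEMENTS = [
    # Stream records inserts/updates/deletes on the raw table.
    "CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders",
    # Task wakes every 5 minutes and loads only when the stream has data.
    """
    CREATE OR REPLACE TASK raw.load_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      INSERT INTO analytics.orders_fact
      SELECT order_id, amount, updated_at
      FROM raw.orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    """,
    "ALTER TASK raw.load_orders RESUME",  # tasks are created suspended
]

conn = snowflake.connector.connect(account="...", user="...", password="...")
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```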
Posted 1 month ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position - Technical Lead
Location - Bangalore/Pune/Hyderabad/Gurugram/Kolkata/Chennai/Mumbai
Experience - 8+ Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make? Visit us @ https://hashedin.com

JOB TITLE: Data Integration Tech Lead (Oracle ODI)
We are seeking an energetic and technically proficient Data Integration Tech Lead to design, build, and optimize robust data integration and analytics solutions using the Oracle technology stack. This role puts you at the core of our enterprise data modernization efforts, responsible for designing, implementing, and maintaining end-to-end data integration pipelines across traditional and cloud platforms. You will leverage your expertise in Oracle Data Integrator (ODI), Oracle Integration Cloud (OIC), and related technologies to drive efficient data movement, transformation, and loading while maintaining the highest standards of data quality, lineage, and governance. You will work hands-on and lead a small team of developers, shaping best practices for data integration workflows and collaborating with Analytics/BI teams to deliver fit-for-purpose solutions.

Mandatory Skills:
Experience: 6–8 years of progressive experience in enterprise data integration, with at least 4 years of hands-on experience in Oracle Data Integrator (ODI). Strong understanding and working experience with Oracle Integration Cloud (OIC), Oracle databases, and related cloud infrastructure. Proven track record in designing and implementing large-scale ETL/ELT solutions across hybrid (on-prem/cloud) architectures.

Technical Proficiency:
Deep hands-on expertise with ODI components (Topology, Designer, Operator, Agent) and OIC (integration patterns, adapters, process automation).
Strong command of SQL and PL/SQL for data manipulation and transformation.
Experience with REST/SOAP APIs, batch scheduling, and scripting (Python, Shell, or similar) for process automation.
Data modeling proficiency (logical/physical, dimensional, OLAP/OLTP).
Familiarity with Oracle Analytics Cloud (OAC), OBIEE, and integration into analytics platforms.
Solid understanding of data quality frameworks, metadata management, and lineage documentation.
Setting up Topology, building objects in Designer, monitoring in Operator, different types of KMs (Knowledge Modules), Agents, etc.
Packaging components and database operations like aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, and migration of objects.
Design and develop complex mappings, process flows, and ETL scripts.
Expertise in developing load plans and scheduling jobs.
Ability to design data quality and reconciliation frameworks using ODI.
Integrate ODI with multiple source/target systems.
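Much of what an ODI mapping produces under the hood is set-based SQL such as a MERGE into the target. A minimal sketch of that pattern using the python-oracledb driver; the connection details and the stg_customers/dim_customers tables are hypothetical:

```python
import oracledb

MERGE_SQL = """
MERGE INTO dim_customers t
USING stg_customers s
   ON (t.customer_id = s.customer_id)
 WHEN MATCHED THEN UPDATE SET t.name = s.name, t.city = s.city
 WHEN NOT MATCHED THEN INSERT (customer_id, name, city)
      VALUES (s.customer_id, s.name, s.city)
"""

def run_merge(user: str, password: str, dsn: str) -> None:
    """Upsert staged rows into the dimension table in one set-based statement."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(MERGE_SQL)
        conn.commit()
```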
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dear Associate,
Greetings from Tata Consultancy Services!!
Thank you for expressing your interest in exploring a career possibility with the TCS family. We have a job opportunity for ETL Test Engineer at Tata Consultancy Services.

Hiring For: ETL Test Engineer
Interview date: 14-June-25, in-person drive
Location: Bangalore
Experience: 4-10 years

Must Have:
1. SQL - Expert-level knowledge of core SQL concepts and querying.
2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks.
3. Create and review complex test cases, test scripts, and test data for ETL processes.
4. ETL automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems.
6. Experience in query optimization, stored procedures/views, and functions.
7. Strong familiarity with data warehouse projects and data modeling.
8. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
9. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage.
11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes.
12. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
13. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
14. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.

If you are interested in this exciting opportunity, please share your updated resume on saranya.devi3@tcs.com along with the additional information mentioned below:
Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
University/Institute name:
Current Organization:
Willing to relocate to Bangalore:
Total Experience:
Relevant Experience (ETL Test Engineer):
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Available for in-person interview on 14-June-25:
Timings:
Attended interview with TCS in past (details):
Please share your iBegin portal EP id if already registered:

Note: only eligible candidates with relevant experience will be contacted further.
Posted 1 month ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role Summary
The AI, Data, and Analytics (AIDA) organization, part of Pfizer Digital, is responsible for the development and management of all data and analytics tools and platforms across the enterprise – from global product development, to manufacturing, to commercial, to point of patient care across 100+ countries. One of the team’s top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products, which will serve as an enabler for the company’s digital transformation to bring innovative therapeutics to patients.

We are looking for a technically skilled and experienced Reporting Engineering Manager who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role will work across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences.

In this position, you will be accountable for a thorough understanding of data, business, and analytic requirements in order to deliver high-impact, relevant, interactive data visualization products that drive company performance through continuous monitoring, measuring, root-cause identification, and proactive identification of patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with stakeholders to understand their needs and ensure that reporting assets are created with a focus on customer experience. The role requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Manager will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

Role Responsibilities
Engineering expert in business intelligence and data visualization products in service of the field force and HQ enabling functions.
Act as a technical BI & Visualization developer on projects and collaborate with global team members (e.g., other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale.
Maintain a thorough understanding of data, business, and analytic requirements (incl. BI product blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines.
Deliver quality functional requirements and solution designs, adhering to established standards and best practices.
Follow Pfizer processes in Portfolio Management, Project Management, and the Product Management Playbook, following Agile, Hybrid, or Enterprise Solution Life Cycle approaches.
Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS.
Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g., AWS Redshift, MS SQL, Snowflake, Oracle, Teradata).

Qualifications
Bachelor’s degree in a technical area such as computer science, engineering, or management information science.
Recent healthcare life sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred. Domain knowledge in the pharmaceutical industry preferred.
Good knowledge of data governance and data cataloging best practices.
Relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management.
Strong business analysis acumen to meet or exceed business requirements following User-Centered Design (UCD).
Strong experience with testing of BI and analytics applications – unit testing (e.g., phased or agile sprints or MVP), system integration testing (SIT), and user acceptance testing (UAT).
Experience with technical solution management tools such as JIRA or GitHub.
Stay abreast of customer, industry, and technology trends with enterprise Business Intelligence (BI) and visualization tools.

Technical Skillset
5+ years of hands-on experience in developing BI capabilities using MicroStrategy.
Proficiency in common BI tools, such as Tableau, Power BI, etc., is a plus.
Common data models (logical & physical) and conceptual data model validation to create a consumption layer for reporting (dimensional model, semantic layer, direct database aggregates, or OLAP cubes).
Development using a design system for reporting as well as ad hoc analytics templates.
BI product scalability and performance tuning.
Platform administration and security; BI platform tenant management (licensing, capacity, vendor access, vulnerability testing).
Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable.
Experience with AWS services EC2, EMR, RDS, and Spark is preferred.
Solid understanding of Scrum/Agile is preferred, plus working knowledge of CI/CD, GitHub, and MLflow.
Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred.
Great communication skills.
Great business influencing and stakeholder management skills.

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.
Information & Business Tech
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Hi {fullName},
There is an opportunity for Azure Data Engineer at Bangalore, for which a walk-in interview will be held on 14th Jun 25 between 9:00 AM and 12:30 PM. Please share the below details to mamidi.p@tcs.com with the subject line "AZURE DATA ENGINEER 14th Jun 25" if you are interested:
Email id:
Contact no:
Total exp:
Preferred location:
Current CTC:
Expected CTC:
Notice period:
Current organization:
Highest qualification that is full time:
Highest qualification university:
Any gap in education or employment:
If yes, how many years and reason for gap:
Are you available for walk-in interview at Bangalore on 14th Jun 25 (Yes/No):
We will share a mail to you by tomorrow night if you are shortlisted.

Role: Azure Data Engineer - Databricks
Required Technical Skill Set: Azure SQL, Azure SQL DW, Azure Data Lake Store, Azure Data Factory
Must have: Implementation and operations of OLTP, OLAP, and DW technologies such as Azure SQL, Azure SQL DW, Azure Data Lake Store, and Azure Data Factory, and an understanding of Microsoft Azure PaaS features. Azure Cloud, Azure Databricks, and Data Factory knowledge are good to have; otherwise any cloud exposure. Ability to gather requirements from the client side and explain them to tech team members. Resolve conflicts in terms of bandwidth or design issues. Good understanding of data modeling, data analysis, and data governance. Very good communication skills and client-handling skills.
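For a flavor of the Databricks side of this role, here is a minimal PySpark sketch that reads raw files from ADLS and writes a curated table; the storage path, column names, and table name are hypothetical placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

# Read raw CSVs landed in the data lake (hypothetical abfss path).
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

curated = (raw
           .withColumn("order_date", F.to_date("order_date"))
           .filter(F.col("amount").cast("double") > 0)   # basic quality gate
           .dropDuplicates(["order_id"]))

# Persist as a managed table for downstream Synapse / Power BI consumption.
curated.write.mode("overwrite").saveAsTable("curated.orders")
```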
Posted 1 month ago
0 years
0 Lacs
India
On-site
Job Duties and Responsibilities:

1. Data Architecture Design
Design scalable and secure enterprise data architectures for transactional systems, data warehouses, and data lakes.
Define data flows, integration strategies, and architecture blueprints to support business intelligence and analytics platforms.

2. Data Modeling
Develop logical, physical, and conceptual data models to represent complex business requirements.
Create and maintain dimensional models (star/snowflake schemas) and normalized data structures for OLTP and OLAP systems.
Use data modeling tools such as ER/Studio, ERwin, or Lucidchart.

3. Data Governance & Quality
Define data standards, naming conventions, and metadata strategies to ensure consistency across systems.
Collaborate with data stewards to enforce data quality rules, validations, and lineage tracking.
Recommend technologies and platforms for data storage, integration, and processing (e.g., Snowflake, Azure Synapse, Redshift, Databricks).
Guide data platform modernization and cloud migration strategies, ensuring alignment with enterprise goals.
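A dimensional model like the star schemas named above reduces to a fact table keyed to its dimensions. A minimal, runnable sketch in Python's built-in sqlite3; the retail tables are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT);

-- Fact table: one row per sale, foreign keys into each dimension (star schema).
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units       INTEGER,
    revenue     REAL
);
""")

conn.execute("INSERT INTO dim_date VALUES (20250101, '2025-01-01', 2025)")
conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-42', 'Widgets')")
conn.execute("INSERT INTO fact_sales VALUES (20250101, 1, 10, 99.90)")

# Typical OLAP-style rollup: revenue by category and year.
for row in conn.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    JOIN dim_date d USING (date_key)
    GROUP BY p.category, d.year
"""):
    print(row)
```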
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: ETL Test Engineer
Experience range: 4-10 years
Location: Current location must be Bangalore ONLY
NOTE: Candidates interested in a walk-in drive in Bangalore must apply.

Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL - Expert-level knowledge of core SQL concepts and querying.
3. ETL automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
9. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
10. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.
Posted 1 month ago
0.0 - 10.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Indore, Madhya Pradesh, India; Noida, Uttar Pradesh, India

Qualification:
Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming, and the following technical skills:
PL/SQL and PostgreSQL programming – ability to write complex SQL queries and stored procedures.
Migration – working experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
Working experience on Cloud SQL/AlloyDB.
Working experience tuning autovacuum in PostgreSQL.
Working experience tuning AlloyDB / PostgreSQL for better performance.
Working experience on BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL.
Ability to tune the AlloyDB / Cloud SQL database for better performance.
Experience with the GCP Database Migration Service.
Working experience on MongoDB.
Working experience on Cloud Dataflow.
Working experience on database disaster recovery.
Working experience on database job scheduling.
Working experience on database logging techniques.
Knowledge of OLTP and OLAP.
Desirable: GCP Database Engineer Certification

Other Skills:
Out-of-the-box thinking
Problem-solving skills
Ability to make tech choices (build vs. buy)
Performance management (profiling, benchmarking, testing, fixing)
Enterprise architecture
Project management/delivery capability/quality mindset
Scope management
Planning (phasing, critical path, risk identification)
Schedule management / estimations
Leadership skills

Other Soft Skills:
Learning ability
Innovative / initiative skills

Skills Required: PostgreSQL, PL/SQL, BigQuery

Role:
Develop, construct, test, and maintain data architectures.
Migrate enterprise Oracle databases from on-prem to the GCP cloud.
Tune autovacuum in PostgreSQL, and tune AlloyDB / PostgreSQL for better performance.
Performance tuning of PostgreSQL stored procedure code and queries.
Convert Oracle stored procedures & queries to PostgreSQL stored procedures & queries.
Create a hybrid data store with data warehouse and NoSQL GCP solutions along with PostgreSQL.
Migrate Oracle table data from Oracle to AlloyDB.
Lead the database team.

Experience: 7 to 10 years
Job Reference Number: 12779
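Autovacuum tuning, mentioned twice above, is usually done per table by lowering the scale factors on large, update-heavy tables so vacuum runs before bloat accumulates. A minimal sketch via psycopg2; the big_orders table, DSN, and the chosen thresholds are illustrative assumptions, not recommendations:

```python
import psycopg2

# With the default 20% scale factor, vacuum on a very large table fires far
# too late; a small fixed fraction keeps dead tuples in check.
TUNE_SQL = """
ALTER TABLE big_orders SET (
    autovacuum_vacuum_scale_factor  = 0.02,
    autovacuum_analyze_scale_factor = 0.01,
    autovacuum_vacuum_cost_delay    = 2
);
"""

with psycopg2.connect("dbname=app host=10.0.0.5 user=dba") as conn:
    with conn.cursor() as cur:
        cur.execute(TUNE_SQL)
```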
Posted 1 month ago
0.0 - 6.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Noida, Uttar Pradesh, India

Qualification: Pre-Sales Solution Engineer - India
Experience areas or skills:
Pre-sales experience with software or analytics products
Excellent verbal & written communication skills
OLAP tools or Microsoft Analysis Services (MSAS)
Data engineering or data warehouse or ETL
Hadoop ecosystem or AWS, Azure, or GCP cluster and processing
Tableau or MicroStrategy or any BI tool
HiveQL or Spark SQL or PL/SQL or T-SQL
Writing and troubleshooting SQL programming or MDX queries
Working on Linux; programming in Python, Java, or JavaScript would be a plus
Filling in RFPs or questionnaires from customers
NDA, success criteria, project closure, and other documentation
Be willing to travel or relocate as per requirement

Role:
Act as the main point of contact for customer contacts involved in the evaluation process
Product demonstrations to qualified leads
Product demonstrations in support of marketing activity such as events or webinars
Own RFP, NDA, PoC success criteria documents, PoC closure, and other documents
Secure alignment on process and documents with the customer/prospect
Own the technical-win phases of all active opportunities
Understand the customer domain and database schema
Provide OLAP and reporting solutions
Work closely with customers to understand and resolve environment, OLAP cube, or reporting-related issues
Coordinate with the solutioning team for execution of PoCs as per the success plan
Create enhancement requests or identify requests for new features on behalf of customers or hot prospects

Experience: 3 to 6 years
Job Reference Number: 10771
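Since the skills list calls out writing and troubleshooting MDX, here is what a basic cube query looks like, wrapped as a Python string for consistency with the other sketches; the Sales cube and its measure/dimension names are hypothetical:

```python
# MDX addresses a cube by measures and dimension hierarchies rather than tables.
# This query asks for a sales measure broken out by calendar year, sliced to one
# product category (all cube metadata here is invented for illustration).
MDX_QUERY = """
SELECT
    { [Measures].[Internet Sales Amount] } ON COLUMNS,
    { [Date].[Calendar Year].Members }     ON ROWS
FROM [Sales]
WHERE ( [Product].[Category].&[Bikes] )
"""

print(MDX_QUERY)  # would be sent to MSAS via an ADOMD/XMLA client in practice
```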
Posted 1 month ago
0.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India;Indore, Madhya Pradesh, India;Bangalore, Karnataka, India;Hyderabad, Telangana, India;Gurgaon, Haryana, India
Qualification :
Required
Proven hands-on experience designing, developing, and supporting database projects for analysis in a demanding environment
Proficient in database design techniques – relational and dimensional designs
Experience with, and a strong understanding of, business analysis techniques
High proficiency in the use of SQL or MDX queries (a small OLAP-style example follows this listing)
Ability to manage multiple maintenance, enhancement, and project related tasks
Ability to work independently on multiple assignments and to work collaboratively within a team
Strong communication skills with both internal team members and external business stakeholders
Added Advantage
Hadoop ecosystem, or AWS, Azure, or GCP cluster and processing
Experience working on Hive, Spark SQL, Redshift, or Snowflake
Experience working on Linux systems
Experience with Tableau, MicroStrategy, Power BI, or any BI tool
Expertise in programming in Python, Java, or shell script would be a plus
Role :
Roles & Responsibilities
Be the front-facing person of the world's most scalable OLAP product company – Kyvos Insights
Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area
Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems
Be the go-to person for customers regarding technical issues during the project
Be instrumental in reading the pulse of the big data market and defining the roadmap of the product
Lead a few small but highly efficient teams of big data engineers
Efficient task status reporting to stakeholders and customers
Good verbal & written communication skills
Be willing to work off hours to meet timelines
Be willing to travel or relocate as per project requirement
Experience : 5 to 10 years
Job Reference Number : 11078
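As a small illustration of the OLAP-style slicing this role demonstrates to customers, the following pandas sketch pivots a toy sales table into a two-dimensional cube. The data and column names are invented for the example; Kyvos itself is queried via SQL or MDX rather than pandas.

```python
# Minimal sketch of OLAP-style slice-and-dice, expressed with pandas.
# The sales data and dimension names are illustrative only.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [120.0, 150.0, 90.0, 110.0],
})

# The pivot is a tiny two-dimensional cube: region x quarter, measure = revenue.
# margins=True adds grand totals, analogous to an OLAP "All" member.
cube = pd.pivot_table(sales, values="revenue", index="region",
                      columns="quarter", aggfunc="sum", margins=True)
print(cube)
```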
Posted 1 month ago
12.0 years
0 Lacs
India
On-site
Job Description
We are seeking a highly experienced Senior Data Modeler with strong expertise in Data Vault modeling and data architecture. The ideal candidate will be responsible for analyzing complex business requirements and designing scalable and efficient data models that align with organizational goals.
Key Responsibilities:
Analyze and translate business requirements into long-term data solutions.
Design and implement conceptual, logical, and physical data models.
Develop and apply transformation rules to ensure accurate data mapping across systems.
Collaborate with development teams to define data flows and modeling strategies.
Establish best practices for data design, coding, and documentation.
Review and enhance existing data models for performance and compatibility.
Optimize local and metadata models to improve system efficiency.
Apply canonical modeling techniques to ensure data consistency.
Troubleshoot and fine-tune data models for optimal performance.
Conduct regular assessments of data systems for accuracy, variance, and performance.
Technical Skills Required:
Proven experience in Data Vault modeling (mandatory); a minimal structural sketch follows this listing.
Strong knowledge of relational and dimensional data modeling (OLTP/OLAP).
Hands-on experience with modeling tools such as Erwin, ER/Studio, Hackolade, Visio, or Lucidchart.
Proficient in SQL and experienced with RDBMS such as Oracle, SQL Server, MySQL, and PostgreSQL.
Exposure to NoSQL databases like MongoDB and Cassandra.
Experience with data warehouses and BI tools such as Snowflake, Redshift, Databricks, Qlik, and Power BI.
Familiarity with ETL processes, data integration, and data governance frameworks.
Preferred Qualifications:
Minimum 12 years of experience in Data Modeling or Data Engineering.
At least 5 years of hands-on experience with relational and dimensional modeling.
Strong understanding of metadata management and related tools.
Knowledge of transactional databases, data warehousing, and real-time data processing.
Experience working with cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Databricks).
Relevant certifications in Data Management, Data Modeling, or Cloud Data Engineering are a plus.
Excellent communication, presentation, and interpersonal skills.
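The sketch below shows the core Data Vault structures named above: a Hub carrying the hashed business key and a Satellite carrying its descriptive attributes. It uses sqlite3 only to stay self-contained; table and column names are illustrative, not a prescribed standard.

```python
# Minimal sketch of core Data Vault structures: a Hub for the business key
# and a Satellite for its descriptive attributes. sqlite3 keeps the demo
# self-contained; all table and column names are illustrative.
import hashlib
import sqlite3
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    # Data Vault commonly derives a deterministic hash from the business key.
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    customer_hk   TEXT PRIMARY KEY,   -- hash of the business key
    customer_id   TEXT NOT NULL,      -- natural/business key
    load_dts      TEXT NOT NULL,
    record_source TEXT NOT NULL
);
CREATE TABLE sat_customer_details (
    customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
    load_dts      TEXT NOT NULL,
    name          TEXT,
    city          TEXT,
    record_source TEXT NOT NULL,
    PRIMARY KEY (customer_hk, load_dts)   -- history kept by load timestamp
);
""")

now = datetime.now(timezone.utc).isoformat()
hk = hash_key("CUST-1001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "CUST-1001", now, "crm"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?, ?)",
             (hk, now, "Asha Rao", "Hyderabad", "crm"))
print(conn.execute("SELECT * FROM hub_customer").fetchall())
```

Separating the immutable key (Hub) from changing attributes (Satellite) is what lets a Data Vault absorb source-system change without remodeling, which is why the posting treats it as mandatory.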
Posted 1 month ago
7.0 years
2 - 7 Lacs
Bengaluru
On-site
Job Summary
The Senior ETL Developer will be a member of a global team with a key emphasis on providing development and data integration expertise for SAP Data Services and the ETL process. This role will provide technical leadership to Data Analytics Analysts and Developers to establish best practices, ensuring efficient and scalable ETL workflows that support business intelligence and data reporting needs. This individual will design and deliver the end-to-end ETL process and Data Analytics technology infrastructure that will feed data to dashboards, scorecards, standard reports, and ad hoc reports. The individual has proven experience providing complex technology solutions, in both SAP BusinessObjects Data Services (BODS) and Power BI (PBI), to support key business processes and providing troubleshooting support within a global manufacturing environment. This individual will report to the Manager of the Data and Analytics Team. This role will be based at our Bangalore office.
Principal Duties and Responsibilities
Develop robust and scalable ETL processes in SAP BusinessObjects Data Services (BODS) for source SAP and non-SAP systems and target OLAP systems (SQL, etc.); a reconciliation-check sketch follows this listing.
Design, estimate, and create project plans for development, testing, and implementation of the ETL process and related tasks.
Manage and maintain the BODS platform, including installation, configuration, upgrades, patching, and monitoring.
Monitor BODS jobs, perform fast troubleshooting and root cause analysis, and provide fast turnaround with resolution of job failures and any other issues in the BODS production system.
Identify opportunities for enhancements to the ETL process; work closely with business and technology partners to seek and provide effective resolution to business issues.
Create documentation to assist business users and IT members in designing and effectively using the solutions developed.
Develop and maintain comprehensive documentation for ETL processes, workflows, and BODS administration procedures.
Lead ETL development, providing training and technical guidance and ensuring best practices in ETL development.
Ability to quickly learn report development in Power BI and other analytics applications.
Knowledge, Skills and Abilities
7-10 years of demonstrated technical mastery of design, development, deployment, and administration of SAP BusinessObjects Data Services and Microsoft ETL applications.
5+ years of Data Warehouse and Data Integration experience working with SAP (ECC6), SQL, and other data warehouse & OLAP applications.
Strong development and implementation expertise in SAP Information Steward and Data Quality; experienced in master data management and governance, creating, publishing, and maintaining data quality rules & scorecards.
Designing complex SAP Data Services job flows to extract from and load to SAP systems and SQL Servers.
Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Performance Tuning, and System Testing.
Strong knowledge of BODS scheduling and the Management Console.
Expert with Data Integration transforms such as Query, Validation, and Case, as well as Data Quality transforms such as Match, Associate, and Data Cleanse.
Configuring BODS components including job servers, repositories, and service designers.
Deep understanding of enterprise data warehousing best practices and standards.
Strong expertise with SQL scripting, creating SSIS packages, and DB migrations.
Strong understanding and knowledge of SAP FICO, SD, MM/Pur, and PP data and tables.
Experience with creating and maintaining SQL servers and databases.
Experience in creating the technical design, architecture, and data flow diagrams for BI and analytics applications.
Experience with Azure services like Azure Data Factory, Azure SQL Database, or Azure Synapse Analytics.
Education and Experience
B.Tech/B.E./MCA or master's in Business Systems Analysis in a relevant stream through a regular course from a recognized university or institute in India.
7-10 years of relevant experience in SAP BODS and ETL applications, working in a global organization.
One or more business intelligence certifications (SAP, Microsoft SQL/Azure, GCP, etc.).
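As a concrete (hypothetical) instance of the post-load checks such an ETL role performs, the sketch below reconciles row counts and a measure total between a source and a target table. sqlite3 stands in for the real SAP/SQL Server connections, and all names are illustrative.

```python
# Minimal sketch of a post-load ETL reconciliation check: compare row counts
# and a column total between source and target tables. sqlite3 stands in for
# the real SAP/SQL Server connections; table and column names are illustrative.
import sqlite3

def reconcile(src, tgt, table, amount_col):
    src_count, src_sum = src.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}").fetchone()
    tgt_count, tgt_sum = tgt.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}").fetchone()
    ok = (src_count, src_sum) == (tgt_count, tgt_sum)
    print(f"{table}: source={src_count}/{src_sum} target={tgt_count}/{tgt_sum} "
          f"-> {'OK' if ok else 'MISMATCH'}")
    return ok

# Demo with two in-memory databases playing source and target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
tgt.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
reconcile(src, tgt, "sales", "amount")
```

In a real BODS pipeline the same count-and-sum comparison would run as a post-load validation step, flagging a job for investigation before downstream reports refresh.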
Posted 1 month ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description
SPARC CYBERTECH PRIVATE LIMITED is a company based in Hyderabad, Telangana, India.
Role Description
This is a full-time on-site role for a BA / Developer - Eagle PACE, EAGLE STAR at SPARC CYBERTECH PRIVATE LIMITED, located in Pune. The role involves day-to-day tasks related to business analysis and development.
Qualifications
Analytical skills, business analysis, and business process knowledge
Strong communication and business requirements understanding
Ability to work effectively in a team environment
Experience with Eagle PACE and EAGLE STAR software is preferred
Bachelor's degree in Computer Science, Information Technology, or a related field
Job Description:
Interact with the onsite technology team, end business users, and development team to ensure business requirements are fully understood.
Translate business requirements into functional requirements, acting as SPOC for any requirement-related query from the development or testing team.
Define functional and non-functional requirements for the project that are clearly understood by the development team.
Understand how GFDR platforms can deliver the project requirements.
Understand feeds to the system (upstream & downstream), data owners, and the functional relevance of each fund attribute stored on the platform.
Participate in defining the project system design specification (author, review & sign off).
Liaise with stakeholders, system owners, or third parties for data provision for system testing.
Provide guidance on data quality and liaise with stakeholders for data completeness.
Maintain the system data dictionary.
Develop acceptance criteria for developed code and review test cases, including unit test cases and test strategy.
Work closely with the team to develop proposals for new initiatives.
Work closely with developers to ensure functional requirements are built into the system as per customer expectations.
Provide pre-implementation and post-implementation support to the application.
Mentor the development team on functional requirements.
Required Skills And Experience
7 years of relevant business/system analysis experience in a financial services firm, having performed a similar role previously.
Graduation in Commerce/Economics or MBA (Finance) preferred.
Experience of working with third-party financial products (Eagle).
Understanding of various components of the Eagle product: Uploader, Exporter, Panel, OLAP reports, etc.
shiv@sparccybertech.com
Posted 1 month ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Applications Development Senior Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Responsibilities:
Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas
Monitor and control all phases of the development process (analysis, design, construction, testing, and implementation) and provide user and operational support on applications to business users
Utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgements
Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality
Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems
Ensure essential procedures are followed and help define operating standards and processes
Serve as advisor or coach to new or lower-level analysts
Operate with a limited level of direct supervision; exercise independence of judgement and autonomy
Act as SME to senior stakeholders and/or other team members
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
Develop market data solutions that will be used by Risk and PnL systems
Develop business-critical enhancements to Risk and PnL modules
Qualifications:
8+ years of experience in software development
Strong C# skills
Strong analytical and problem-solving skills
Experience in design and development
Team player, able to work effectively in multi-disciplinary teams and with low levels of supervision
Excellent C# and .NET skills; TDD, OOAD
Good understanding of design patterns
Work experience with GemFire and messaging systems (EMS/Kafka)
Work experience with SQL Server, OLAP, SSAS
Knowledge of web development such as ASP.NET, MVC, JavaScript, and Ajax
Very good all-round technology skills
Python (optional but preferred)
Education:
A good academic background, with at least an undergraduate degree, preferably in a technical or business-related subject.
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.
Data Modeler Job Description
Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include:
Analyze and translate business needs into long-term solution data models.
Evaluate existing data systems and recommend improvements.
Define rules to translate and transform data across data models.
Work with the development team to create conceptual data models and data flows.
Develop best practices for data coding to ensure consistency within the system.
Review modifications of existing systems for cross-compatibility.
Implement data strategies and develop physical data models.
Update and optimize local and metadata models.
Utilize canonical data modeling techniques to enhance data system efficiency.
Evaluate implemented data systems for variances, discrepancies, and efficiency.
Troubleshoot and optimize data systems to ensure optimal performance.
Strong expertise in relational and dimensional modeling (OLTP, OLAP); a small star-schema sketch follows this listing.
Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
Familiarity with ETL processes, data integration, and data governance frameworks.
Strong analytical, problem-solving, and communication skills.
Qualifications
Bachelor's degree in Engineering or a related field.
3 to 5 years of experience in data modeling or a related field.
4+ years of hands-on experience with dimensional and relational data modeling.
Expert knowledge of metadata management and related tools.
Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucidchart.
Knowledge of transactional databases and data warehouses.
Preferred Skills
Experience in cloud-based data solutions (AWS, Azure, GCP).
Knowledge of big data technologies (Hadoop, Spark, Kafka).
Understanding of graph databases and real-time data processing.
Certifications in data management, modeling, or cloud data engineering.
Excellent communication and presentation skills.
Strong interpersonal skills to collaborate effectively with various teams.
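To make the dimensional-modeling requirement concrete, here is a minimal star-schema sketch: two dimension tables, one fact table, and a typical rollup query. sqlite3 keeps it self-contained; all table and column names are illustrative.

```python
# Minimal star-schema sketch: dimensions around a central fact table,
# plus a typical OLAP-style rollup query. sqlite3 is used only to keep
# the demo self-contained; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INT, month INT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
INSERT INTO dim_date VALUES (20250101, 2025, 1), (20250201, 2025, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (20250101, 1, 3, 30.0), (20250201, 2, 1, 25.0);
""")

# Typical rollup: revenue by category and month, joining facts to dimensions.
for row in conn.execute("""
    SELECT p.category, d.year, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category, d.year, d.month
"""):
    print(row)
```

The design choice the schema illustrates: measures live only in the fact table, while descriptive attributes live in the dimensions, so new reporting slices come from joins rather than schema changes.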
Posted 1 month ago
1.0 - 3.0 years
2 - 5 Lacs
Bengaluru
On-site
Job Information
Date Opened: 06/06/2025
Job Type: Full time
Industry: IT Services
Work Experience: 1-3 years
City: Bengaluru
State/Province: Karnataka
Country: India
Zip/Postal Code: 560027
Job Description
We are looking for a Business Intelligence (BI) Developer to create and manage BI and analytics solutions that turn data into knowledge. In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. If you also have business acumen and problem-solving aptitude, we'd like to meet you. Ultimately, you will develop our business intelligence solutions to help our clients make better decisions.
Translate business needs to technical specifications
Design, build and deploy BI solutions (e.g. reporting tools)
Maintain and support data analytics platforms (e.g. MicroStrategy)
Create tools to store data (e.g. OLAP cubes)
Conduct unit testing and troubleshooting
Evaluate and improve existing BI systems
Collaborate with teams to integrate systems
Develop and execute database queries and conduct analyses
Create visualizations and reports for requested projects
Develop and update technical documentation
Requirements
The role requires complete comprehension of enterprise data models and utilization of both relational and dimensional data sources. The ideal candidate should have experience in .NET / SharePoint / Power BI design and development with a general understanding of BI and reporting.
Experience in user interface design, data sourcing, data transformation, and report and dashboard development
Experience in end-to-end implementation of Business Intelligence (BI) projects, especially scorecards, KPIs, reports & dashboards
Perform detailed analysis of source systems and source system data, and model that data
Design, develop, and test scripts to import data from source systems and dashboards to meet customer requirements
Interpret written business requirements and technical specification documents
Create and maintain technical design documentation
Perform quality coding to business and technical specifications
Ensure that server processes continue to run and operate in the most efficient manner
Work directly with business units to define, prototype, and validate applications
Extract, transform, and load data from multiple sources
Monitor and maintain all components that make up the Data Warehouse and Business Intelligence infrastructure
Design, create, and tune physical database objects (tables, views, indexes) to support logical and dimensional models, and maintain the referential integrity of the database (an index-tuning sketch follows this listing)
Provide input on proposing, evaluating, and selecting appropriate design alternatives which meet client requirements and are consistent with the client's current standards and processes
Demonstrate software to internal / client technical teams as needed
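As a small, hypothetical illustration of tuning a physical database object, the sketch below creates an index and uses sqlite3's EXPLAIN QUERY PLAN to confirm the optimizer switches from a full scan to an index search. Production work would use the target RDBMS's own plan tooling; names here are illustrative.

```python
# Minimal sketch of creating and sanity-checking a physical database object
# (an index), using sqlite3's EXPLAIN QUERY PLAN to confirm the index is
# actually used. Table and index names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Before indexing: the plan shows a full table scan.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())

conn.execute("CREATE INDEX ix_orders_customer ON orders(customer_id)")

# After indexing: the plan switches to an index search on ix_orders_customer.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())
```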
Posted 1 month ago
4.0 years
10 Lacs
Noida
On-site
At Cotality, we are driven by a single mission: to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society.
Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.
Job Description:
In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.
QA Automation Engineer
As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.
Responsibilities
Develop and implement automation frameworks: design, build, and maintain scalable test automation frameworks tailored for data warehousing environments (a data-quality test sketch follows this listing).
Test strategy and execution: define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
Data validation: implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
Performance testing: ensure that the data warehouse systems meet performance benchmarks through automation tools and load testing strategies.
Collaborate with teams: work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
Continuous integration: integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
Defect tracking and reporting: use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
Test data management: develop strategies for handling large volumes of test data while maintaining data security and privacy.
Tool and technology evaluation: stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.
Job Qualifications: Requirements and Skills
At least 4+ years of experience.
Solid understanding of data warehousing concepts (ETL, OLAP, data marts, Data Vault, star/snowflake schemas, etc.).
Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing.
Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations.
Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes.
Performance testing experience.
Experience with version control systems like Git.
Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.
Strong communication and collaboration skills.
Attention to detail and a passion for delivering high-quality solutions.
Ability to work in a fast-paced environment and manage multiple priorities.
Enthusiasm for learning new technologies and frameworks.
Experience with the following tools and technologies is desired:
Qlik Replicate
Matillion ETL
Snowflake
Data Vault warehouse design
Power BI
Azure Cloud, including Logic Apps, Azure Functions, and ADF
Cotality's Diversity Commitment:
Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.
Equal Opportunity Employer Statement:
Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.
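Since the posting asks for Python-based automation with SQL validation, here is a minimal pytest sketch of the kind of data-quality checks described: null, uniqueness, and referential-integrity tests. sqlite3 stands in for Snowflake/SQL Server, and the schema is invented for the example.

```python
# Minimal pytest sketch of automated data-quality checks: nulls, duplicates,
# and referential integrity validated with SQL. sqlite3 stands in for the
# real warehouse; table and column names are illustrative.
import sqlite3
import pytest

@pytest.fixture()
def wh():
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_policy (policy_id TEXT PRIMARY KEY);
        CREATE TABLE fact_claim (claim_id TEXT, policy_id TEXT, amount REAL);
        INSERT INTO dim_policy VALUES ('P1'), ('P2');
        INSERT INTO fact_claim VALUES ('C1', 'P1', 100.0), ('C2', 'P2', 50.0);
    """)
    yield conn
    conn.close()

def test_no_null_business_keys(wh):
    nulls = wh.execute(
        "SELECT COUNT(*) FROM fact_claim WHERE claim_id IS NULL").fetchone()[0]
    assert nulls == 0

def test_claim_ids_unique(wh):
    dupes = wh.execute(
        "SELECT claim_id FROM fact_claim GROUP BY claim_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert dupes == []

def test_referential_integrity(wh):
    # Every fact row must reference an existing dimension member.
    orphans = wh.execute(
        "SELECT COUNT(*) FROM fact_claim f "
        "LEFT JOIN dim_policy d ON f.policy_id = d.policy_id "
        "WHERE d.policy_id IS NULL").fetchone()[0]
    assert orphans == 0
```

Run with `pytest`; in a CI/CD pipeline the same tests would point at the loaded warehouse rather than an in-memory fixture, failing the deployment when a rule is violated.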
Posted 1 month ago