
8529 Data Modeling Jobs - Page 7

JobPe aggregates listings for easy access; you apply directly on the original job portal.

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad

Work from Office

About the Role
As a Data Engineer, you will design, build, and enhance data pipelines and platforms supporting the One Platform/RDS team. You will be responsible for developing scalable data solutions, optimizing data workflows, and ensuring data quality and governance. You will collaborate with cross-functional teams to understand data requirements and deliver robust data infrastructure that supports analytics and business intelligence. You will also contribute to the continuous improvement of data engineering practices and mentor junior team members.

About the Team
Our team is the One Platform/RDS team, responsible for building and maintaining data platforms and services that support enterprise-wide data needs. We are passionate about data and technology, and we work closely with stakeholders across the organization to deliver high-quality data solutions. As part of a global team, you will have the opportunity to work with diverse technologies and contribute to strategic data initiatives.

About You
To succeed in this role, you will possess:
- 5+ years of experience in data engineering with strong proficiency in Python, SQL, and Spark.
- Hands-on experience with cloud platforms such as Azure, AWS, or GCP, with a preference for Azure Data Services.
- Experience with Databricks, Delta Lake, and building ETL/ELT pipelines for large-scale data processing (illustrated in the sketch below).
- Proficiency in data modeling, data warehousing, and performance tuning.
- Familiarity with data governance, data quality frameworks, and metadata management.
- Experience with containerization and orchestration tools such as Docker and Kubernetes.
- Strong understanding of CI/CD pipelines and DevOps practices for data engineering.
- Ability to collaborate with data scientists, analysts, and business stakeholders to deliver data solutions.
- Strong analytical and problem-solving skills with a focus on delivering business value.
- Excellent communication and presentation skills with proficiency in English.
- Understanding of the insurance/reinsurance business is an advantage.
- Comfort working in a fast-paced, ambiguous environment with a focus on outcomes.

About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

Reference Code: 134820
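For a concrete picture of the Databricks/Delta Lake pipeline work this posting describes, here is a minimal PySpark sketch: ingest raw files, apply basic quality checks, and write a Delta table. All paths, table names, and columns are invented for illustration, not taken from the posting.

```python
# Hedged sketch of a bronze-layer ETL step on Databricks; names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_etl").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/landing/claims/")

cleaned = (
    raw.filter(F.col("claim_id").isNotNull())                     # basic quality gate
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("ingest_date", F.current_date())               # lineage metadata
)

# Delta format provides ACID writes and time travel on the lakehouse;
# assumes a "bronze" database already exists.
cleaned.write.format("delta").mode("append").saveAsTable("bronze.claims")
```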

Posted 3 days ago

Apply

3.0 - 8.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Are you passionate about working with Amazon-scale data, analytics, and data science? Do you love bringing data together from diverse systems and sources and working on critical analytics problems to understand customer behavior and generate actionable insights? Does the idea of partnering with a team of highly experienced machine-learning scientists and engineers excite you? The Minerva Science and Analytics team is looking for an experienced, self-driven business-intelligence engineer to help us synthesize data into knowledge across a large number of businesses to help independent authors bring their creativity to customers, detect fraudulent and abusive behavior, and democratize content creation in a safe, efficient, and exciting way. Our team has mature areas and green-field opportunities. We offer technical autonomy, value end-to-end ownership, and have a strong customer-focused culture. Come join us as we revolutionize the book industry and deliver an amazing experience to our authors and readers.

As a Business Intelligence Engineer at Amazon, you will connect with world leaders in your field working on similar problems. You will work with massive-scale data and provide analytic support to scientists, product managers, and engineers using these data. You will utilize your deep expertise in business analysis, metrics, reporting, and analytic tooling/languages like SQL, Excel, and others to translate the data into meaningful insights. You will have ownership of the insights you are building for the business and will play an integral role in tactical decision-making for critical risk areas.

About the team
Minerva is a cross-functional team of highly experienced scientists, data engineers, and software engineers with a critical business mission: making revolutionary leaps forward using massive-scale data with advanced analytics and machine learning, and helping democratize the publishing industry. We build science-based systems for marketing and content discovery for indie authors, fraud/abuse, and content risk. We also use science to optimize manufacturing, fulfillment, and quality processes for our Print On Demand (POD) business.

Qualifications:
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling (see the sketch below)
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets
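As a hedged illustration of the "SQL to pull data, Python to process it" workflow this posting asks for: the connection string, table, and column names below are placeholders, and the Redshift dialect assumes the sqlalchemy-redshift driver is installed.

```python
# Illustrative only: pull aggregated metrics from a Redshift-style warehouse,
# then shape them in pandas for downstream modeling.
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("redshift+psycopg2://user:pass@host:5439/db")

query = """
    SELECT author_id,
           DATE_TRUNC('week', order_date) AS week,
           COUNT(*)                       AS orders,
           SUM(revenue)                   AS revenue
    FROM   sales.orders
    WHERE  order_date >= DATEADD(month, -6, CURRENT_DATE)
    GROUP  BY 1, 2
"""

df = pd.read_sql(query, engine)

# Simple feature for a behavior model: week-over-week revenue change per author
df = df.sort_values(["author_id", "week"])
df["revenue_wow"] = df.groupby("author_id")["revenue"].pct_change()
```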

Posted 3 days ago

Apply

1.0 - 6.0 years

8 - 9 Lacs

Bengaluru

Work from Office

The IN Data Engineering & Analytics (IDEA) team is looking to hire a rock star Data Engineer to build and manage the largest petabyte-scale data infrastructure in India for Amazon India businesses. IDEA is the central data engineering and analytics team for all A.in businesses. The team's charter includes: 1) providing Unified Data and Analytics Infrastructure (UDAI) for all A.in teams, which includes a central petabyte-scale Redshift data warehouse, analytics infrastructure, frameworks for visualizing and automating the generation of reports and insights, and self-service data applications for ingesting, storing, discovering, processing, and querying data; 2) providing business-specific data solutions for various business streams like Payments, Finance, and Consumer & Delivery Experience.

The Data Engineer will play a key role as a strong owner of our Data Platform. He/she will own and build data pipelines, automations, and solutions to ensure the availability, system efficiency, IMR efficiency, scaling, expansion, operations, and compliance of the data platform that serves 200+ IN businesses. The role sits at the heart of the technology and business worlds and provides opportunities for growth, high business impact, and working with seasoned business leaders. An ideal candidate will have a sound technical background in managing large data infrastructures, working with petabyte-scale data, building scalable data solutions/automations, and driving operational excellence. An ideal candidate will be a self-starter who can start with a platform requirement and work backwards to conceive and devise the best possible solution; a good communicator while driving customer interactions; a passionate learner of new technology when the need arises; a strong owner of every deliverable in the team; and obsessed with customer delight and business impact, getting work done in business time.

Responsibilities:
1. Design/implement automation and manage our massive data infrastructure to scale for the analytics needs of Amazon IN.
2. Build solutions to achieve BAA (Best At Amazon) standards for system efficiency, IMR efficiency, data availability, consistency, and compliance.
3. Enable efficient data exploration and experimentation on large datasets on our data platform, and implement data access control mechanisms for stand-alone datasets.
4. Design and implement scalable and cost-effective data infrastructure to enable non-IN (Emerging Marketplaces and WW) use cases on our data platform.
5. Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, Amazon, and AWS big data technologies.
6. Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment.
7. Drive operational excellence strongly within the team and build automation and mechanisms to reduce operational load.
8. Enjoy working closely with your peers in a group of very smart and talented engineers.

A day in the life
The India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy, and timely access to high-quality data. We achieve this by providing UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets, and self-service reporting capabilities. Our core responsibilities towards the India marketplace include: a) providing systems (infrastructure) and workflows that allow ingestion, storage, processing, and querying of data; b) building ready-to-use datasets for easy and faster access to the data; c) automating standard business analysis, reporting, and dashboarding; d) empowering the business with self-service tools to manage data and generate insights.

Qualifications:
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, or DataStage

Posted 3 days ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

We are looking to hire an insightful, results-oriented Business Intelligence Engineer to produce and drive analyses for the Worldwide Operations Security (WWOS) team at Amazon. To keep our operations network secure and assure operational continuity, we are seeking an experienced professional who wants to join our Business Insights team. This role involves translating broad business problems into specific analytics projects, conducting deep quantitative analyses, and communicating results effectively.

Responsibilities:
- Design and implement scalable data infrastructure solutions
- Create and maintain data pipelines for metric tracking and reporting
- Develop analytical models to identify theft/fraud trends and patterns
- Partner with stakeholders to translate business needs into analytical solutions
- Build and maintain data visualization dashboards for operational insights

A day in the life
As a Business Intelligence Engineer I, you will collaborate with cross-functional teams to design and implement data solutions that drive business decisions. Your day might include analyzing theft and fraud patterns, building automated reporting systems, or presenting insights to stakeholders. You'll work with petabyte-scale data sets and have the opportunity to influence strategic decisions through your analysis.

About the team
We are part of the Business Insights team under the Strategy vertical in Worldwide Operations Security, focusing on data analytics to support security and loss prevention initiatives. Our team collaborates across global operations to develop innovative solutions that protect Amazon's assets and contribute to business profitability. We leverage technology to identify patterns, prevent losses, and strengthen our operational network.

Qualifications:
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with one or more industry analytics/visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g., t-test, chi-squared; illustrated below)
- Experience with a scripting language (e.g., Python, Java, or R)
- Master's degree or advanced technical degree
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis, correlation analysis
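A small, self-contained sketch of the statistical methods the posting names (t-test and chi-squared), applied to a hypothetical loss-prevention question with synthetic data:

```python
# Synthetic example of the posting's named tests; no real data is implied.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# t-test: do sites with a new security control show lower shrinkage?
control = rng.normal(loc=1.8, scale=0.4, size=200)   # % shrinkage, control sites
treated = rng.normal(loc=1.6, scale=0.4, size=200)   # % shrinkage, treated sites
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t={t_stat:.2f}, p={p_value:.4f}")

# Chi-squared: is incident type independent of shift?
observed = np.array([[52, 31],    # rows: day/night shift
                     [38, 47]])   # cols: theft/fraud incidents
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p:.4f}")
```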

Posted 3 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Software Engineer - PLM (Centric NGPLM Platform)

Company Description
We are looking for an immediate joiner who wants to grow with us!

Job Description
We are looking for a skilled Data Architect to join our PLM Centric NGPLM Platform team. The ideal candidate will play a critical role in shaping and maintaining data architecture standards, focusing on efficient data modeling, integration, and exchange across the enterprise PLM ecosystem.

Responsibilities:
- Collaborate with cross-functional teams to define and implement data architecture guidelines, including data modeling, access, and integration contract design.
- Lead the design and execution of data exchanges between PLM and other enterprise systems.
- Partner with Value Stream stakeholders and leadership to understand business needs and deliver effective data solutions.
- Work closely with the Centric Hub team to define and document data contracts and perform data mapping.
- Execute large-scale data migration using ETL techniques.
- Ensure seamless integration and alignment of data flows with business processes and systems.

Required Skills & Qualifications:
- 5 to 8 years of strong knowledge of retail PLM systems and data models, preferably Centric PLM.
- Proven experience in ETL processes and tools for large-scale data migration.
- Proficiency in working with JSON and XML data formats.
- Solid understanding of data architecture principles, including data access, transformation, and contract management.
- Excellent analytical, communication, and stakeholder management skills.

Preferred Qualifications:
- Experience in PLM-Centric integrations or similar enterprise platform ecosystems.
- Familiarity with agile methodologies and DevOps principles in data projects.

Start: Immediate
Location: Bangalore
Form of employment: Full-time until further notice; we apply a 6-month probationary period. We interview candidates on an ongoing basis, so do not wait to submit your application.

Posted 3 days ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Job Type: Full-time
Location: Hyderabad / Work from office
Experience: 8+ years
No. of positions: 1

Job Description:
We are looking for an experienced ETL Lead to manage and deliver enterprise-grade data integration solutions using Azure Data Factory (ADF), SSIS, SQL querying, Azure SQL, Azure Data Lake, and preferably Azure Databricks. The role includes leading a team, building scalable ETL pipelines, and ensuring data quality and performance through efficient CI/CD practices.

Key Responsibilities:
- Lead a team of engineers and manage ETL project lifecycles.
- Design, develop, and optimize ETL workflows using ADF and SSIS.
- Write complex SQL queries and perform performance tuning.
- Integrate data from varied sources into Azure SQL and Data Lake.
- Implement CI/CD pipelines for automated deployment and testing.
- Collaborate with stakeholders to translate business needs into technical solutions.
- Maintain documentation and enforce best practices.

Requirements:
- 8+ years in ETL development and data integration.
- Strong expertise in ADF, SSIS, SQL querying, Azure SQL, and Azure Data Lake.
- Experience with CI/CD tools (e.g., Azure DevOps, Git).
- Exposure to Azure Databricks is a plus.
- Solid understanding of data warehousing and data modeling.

Posted 3 days ago

Apply

2.0 - 4.0 years

9 - 10 Lacs

Gurugram

Work from Office

What's the role?
The Reporting Analyst must build an in-depth understanding of the businesses, strategy, and priorities across many dimensions to provide analysis and insights, identify key value drivers, and highlight existing value gaps with points of improvement. The role has a lot of opportunity to directly impact the performance of various business functions. As a Reporting Analyst, you will work in a global, multicultural, collaborative, fun, and agile work environment; you will be responsible for leading and managing internal and external front-end development tools and work on state-of-the-art reporting and dashboarding solutions.

Who is Hilti?
At Hilti, we are a passionate global team committed to making construction better. As a trusted partner for productivity, safety, and sustainability, we provide our customers innovative solutions that impact the buildings, roads, and infrastructure people rely on every day. Hilti is where individuals grow lasting careers by exploring possibilities, maximizing their potential, owning their development, and making a real difference every day.

What does the role involve?
- Define business requirements and recommend solutions in the form of features/epics, user stories, and other document artifacts.
- Update data products in production on time and with 100% accuracy; identify and manage data sources using SAP BW/4HANA and a data lake (SQL Server).
- Develop dashboards using SAC, Power BI, and Excel; perform data source mapping.
- Ensure the expected flow of data from back end to front end in cooperation with other team members.
- Provide updates on data product developments (current state, planned, and phased-out products).
- Make recommendations on the desirability, viability, and feasibility of requested data products to support decisions driving reporting/dashboarding requirements.
- Participate in global and regional information management meetings/calls to align on priorities, development design, and improvements.
- Ask the right questions to drive innovation and simplification and to reduce complexity.
- Manage release cycles and life cycle management of your data product range.
- Coordinate delivery of global and regional user stories/dashboards and produce regular and ad-hoc reporting.

Known for our focus on providing fulfilling careers and a culture of Performance with Care, we are ranked 16th among India's Best Workplaces and 17th among Asia's Best Workplaces by the Great Place to Work Institute. Watch these videos to know more:
Celebrating 25 years of Hilti India in style - https://youtu.be/oR4WFxYDsKQ
Hear what our employees have to say on Hilti India's legacy | #25YearsOfHilti - https://youtu.be/8k8qg8JoUaw
Hilti India: A great place to work for women - https://youtu.be/gq3uliJy3c0

What do we offer?
Your responsibilities will be great and, with them, we'll give you the freedom and autonomy to do whatever it takes to deliver outstanding results. We'll offer you opportunities to move around the business to work abroad, experience different job functions, and tackle different markets. It's a great way to find the right match for your ambitions and achieve the exciting career you're after. We have a very thorough people review process which enables your career progression as soon as you're ready for the next challenge.

What you need is:
- Bachelor's/Master's degree in computer science, information systems, or business analytics
- 2-4 years of on-the-job experience in reporting and analytics
- Solid work experience with SQL Server (procedures, functions, views, SQL Agent)
- Experience in data extraction, data transformation, data loading, and data quality management (DDL, DCL, DML, DQL, etc.)
- Experience in one of the BI tools such as Power BI or SAP Analytics Cloud
- Work experience with SAP BW, BOA, Excel, and VBA
- Solid computer skills, including Microsoft Office and databases
- Solid experience in database management and data modeling, creating database schemas that represent and support business processes
- Preferred: experience in SAP Analytics applications

Why should you apply?

Posted 3 days ago

Apply

5.0 - 7.0 years

16 - 18 Lacs

Chennai

Work from Office

Be a key player in Ford's exciting journey to reshape its product and program data! As a Program and Product Structure Transformation Specialist, you will contribute to a company-wide initiative to redefine our product and program data landscape. Working closely with the Transformation Manager and cross-functional teams, you will analyze existing data, develop and implement data transformation processes, and ensure data quality. This role offers a significant opportunity to leverage your data expertise and drive impactful change within a large-scale project.

Qualifications:
- 5-7 years of experience in data analysis, data management, or a related field, with a focus on data transformation.
- Strong understanding of data elements, data types, and data relationships.
- Excellent analytical and problem-solving skills, with the ability to analyze complex data sets and identify patterns.
- Ability to think logically and systematically about data transformation processes and develop efficient solutions.
- Bachelor's degree in a related field (e.g., Computer Science, Data Science, Engineering, Business).
- Excellent written and verbal communication skills, with the ability to present technical information to both technical and non-technical audiences.
- Ability to work effectively in a team environment and collaborate with diverse stakeholders.
- Proficiency in Microsoft Excel (including advanced data manipulation and analysis functions).
- Strong proficiency in data query languages (e.g., SQL).
- Experience with data modeling tools.
- Experience in an engineering or manufacturing environment is highly desirable.
- Familiarity with PLM systems and product development processes is a plus.
- Experience with Microsoft Teams, SharePoint, and OneNote.

Bonus Points:
- Experience with data transformation tools and ETL processes.
- Proficiency in programming languages (e.g., Python) for data analysis and automation.
- Experience with data visualization tools (e.g., Power BI, Tableau).
- Knowledge of data quality principles and practices, and experience implementing data quality initiatives.
- Experience with Agile development methodologies.
- Familiarity with product and BOM structures (EBOM, MBOM, and BOP).

Responsibilities:
- Analyze existing product and program data structures to understand their content, quality, and relationships.
- Decompose complex data structures into foundational elements (entities, attributes, relationships).
- Identify value-added vs. non-value-added data elements and propose strategies for data optimization.
- Collaborate with cross-functional teams to understand the desired future-state product and program data structures.
- Map current-state data elements to future-state elements, identifying gaps and potential transformation challenges.
- Develop and document data mapping specifications.
- Develop transformation logic and rules for converting data from the current state to the future state, ensuring data integrity and consistency.
- Create and maintain comprehensive documentation for data transformation processes.
- Develop, test, and execute data transformation scripts and processes (see the sketch after this list).
- Validate and cleanse transformed data to ensure accuracy and completeness.
- Identify and troubleshoot data quality issues during the transformation process, implementing corrective actions.
- Work closely with cross-functional teams, including engineering, manufacturing, IT, and business stakeholders.
- Communicate effectively with team members and stakeholders to share progress, identify risks, and resolve issues.
- Participate in meetings and workshops to contribute to the overall transformation strategy and provide data-driven insights.
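The transformation-script responsibility above might look like the following minimal pandas sketch, driven by a documented mapping specification with a simple validation pass. Every element name here is hypothetical, not Ford's actual data model.

```python
# Hedged sketch of mapping-spec-driven transformation; column names are invented.
import pandas as pd

# Data mapping specification: current-state element -> future-state element
MAPPING_SPEC = {
    "PART_NO":   "part_number",
    "PROG_CODE": "program_id",
    "EFFECT_DT": "effectivity_date",
}

def transform(current: pd.DataFrame) -> pd.DataFrame:
    """Rename and reshape current-state columns per the mapping spec."""
    future = current.rename(columns=MAPPING_SPEC)[list(MAPPING_SPEC.values())]
    future["effectivity_date"] = pd.to_datetime(
        future["effectivity_date"], errors="coerce"   # flag unparseable dates as NaT
    )
    return future

def validate(future: pd.DataFrame) -> list[str]:
    """Post-transformation quality checks; returns a list of issue descriptions."""
    issues = []
    if future["part_number"].isna().any():
        issues.append("null part_number after transformation")
    if future["effectivity_date"].isna().any():
        issues.append("unparseable effectivity_date values")
    return issues
```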

Posted 3 days ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Reltio Data Migration/Data Quality Engineer - Globalsoft, Inc.

The candidate should have strong experience in data extraction, transformation, cleansing, validation, and bulk loading into the Reltio MDM platform. Proficiency in SQL and ETL scripting (such as Python or Talend) is required, along with familiarity with the Reltio data model (Level 3), JSON formats, and Reltio Loader. The role requires expertise in CLI- and API-based ingestion processes, Reltio Bulk API processing, Reltio Logical Configuration Application (LCA), and data quality rule implementation. Hands-on experience with ETL development using Java or Python is also essential.
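As a rough, non-authoritative sketch of the pre-load cleansing and validation step such a role involves, the code below shapes cleansed rows into a JSON payload for a bulk loader. The entity-type URI and attribute structure are illustrative stand-ins, not Reltio's documented API.

```python
# Generic pre-load sketch (not Reltio's actual API); all shapes are illustrative.
import json

REQUIRED = {"first_name", "last_name", "country"}

def cleanse(row: dict) -> dict:
    """Trim whitespace from string fields before validation."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}

def validate(row: dict) -> bool:
    """Require all mandatory fields to be present and non-empty."""
    return REQUIRED.issubset(k for k, v in row.items() if v)

def to_payload(rows: list[dict]) -> str:
    """Serialize valid rows to the JSON shape a bulk loader might expect."""
    cleaned = [cleanse(r) for r in rows]
    entities = [
        {"type": "configuration/entityTypes/Contact",   # illustrative type URI
         "attributes": {k: [{"value": v}] for k, v in r.items()}}
        for r in cleaned if validate(r)
    ]
    return json.dumps(entities, indent=2)
```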

Posted 3 days ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Overview
Stats Perform is the market leader in sports tech. We provide the most trusted sports data to some of the world's biggest organizations, across sports, media, and broadcasting. Through the latest AI technologies and machine learning, we combine decades' worth of data with the latest in-game happenings. We then offer coaches, teams, professional bodies, and media channels around the world access to the very best data, content, and insights. In turn, we improve how sports fans interact with their favorite sports teams and competitions.

How do they use it? Media outlets add a little magic to their coverage with our stats and graphics packages. Sportsbooks can offer better predictions and more accurate odds. The world's top coaches are known to use our data to make critical team decisions. Sports commentators can engage with fans on a deeper level, using our stories and insights. Anywhere you find sport, Stats Perform is there.

However, data and tech are only half of the package. We need great people to fuel the engine. We succeed thanks to a team of amazing people. They spend their days collecting, analyzing, and interpreting data from a wide range of live sporting events. If you combine this real-time data with our 40-year-old archives, elite journalists, camera operators, copywriters, the latest in AI wizardry, and a host of behind-the-scenes support staff, you've got all the ingredients to make it a magical experience!

Responsibilities:
We are seeking a highly analytical and detail-oriented Business Analyst to join our team. This role is crucial in transforming raw data into actionable insights, primarily through the development of interactive dashboards and comprehensive data analysis. The successful candidate will bridge the gap between business needs and technical solutions, enabling data-driven decision-making across the organization.

Key Responsibilities:
- Requirements Gathering: Collaborate with stakeholders across various departments to understand their data needs, business challenges, and reporting requirements.
- Data Analysis: Perform in-depth data analysis to identify trends, patterns, and anomalies, providing clear and concise insights to support strategic initiatives.
- Dashboard Development: Design, develop, and maintain interactive and user-friendly dashboards using leading data visualization tools (e.g., Tableau, Power BI) to present key performance indicators (KPIs) and business metrics.
- Data Modeling & Querying: Utilize SQL to extract, transform, and load data from various sources, ensuring data accuracy and integrity for reporting and analysis.
- Reporting & Presentation: Prepare and deliver compelling reports and presentations of findings and recommendations to both technical and non-technical audiences.
- Data Quality: Work closely with IT and data teams to ensure data quality, consistency, and accessibility.
- Continuous Improvement: Proactively identify opportunities for process improvements, data efficiency, and enhanced reporting capabilities.
- Stakeholder Management: Build strong relationships with business users, understanding their evolving needs and providing ongoing support for data-related queries.

Desired Qualifications:
- Education: Bachelor's degree in Business, Finance, Economics, Computer Science, Information Systems, or a related quantitative field.
- Experience: Proven experience (typically 3+ years) as a Business Analyst, Data Analyst, or similar role with a strong focus on data analysis and dashboarding.
- Data Visualization Tools: Proficiency in at least one major data visualization tool (e.g., Tableau, Microsoft Power BI, Looker).
- SQL: Strong proficiency in SQL for data extraction, manipulation, and analysis from relational databases.
- Data Analysis: Excellent analytical and problem-solving skills with the ability to interpret complex datasets and translate them into actionable business insights.
- Communication: Exceptional written and verbal communication skills, with the ability to explain technical concepts to non-technical stakeholders.
- Business Acumen: Solid understanding of business processes and key performance indicators.
- Attention to Detail: Meticulous attention to detail and a commitment to data accuracy.

Nice-to-Have:
- Experience with statistical programming languages (e.g., Python with pandas/NumPy) for advanced data manipulation and analysis.
- Familiarity with data warehousing concepts and cloud data platforms (e.g., Snowflake, AWS Redshift, Google BigQuery).
- Experience with advanced Excel functions (e.g., Power Query, Power Pivot).
- Certification in relevant data visualization tools.

Why work at Stats Perform?
We love sports, but we love diverse thinking more! We know that diversity brings creativity, so we invite people from all backgrounds to join us. At Stats Perform you can make a difference; by using your skills and experience every day, you'll feel valued and respected for your contribution.

We take care of our colleagues. We like happy and healthy colleagues. You will benefit from things like Mental Health Days Off, No Meeting Fridays, and flexible working schedules. We pull together to build a better workplace and world for all. We encourage employees to take part in charitable activities, utilize their 2 days of Volunteering Time Off, support our environmental efforts, and be actively involved in Employee Resource Groups.

Diversity, Equity, and Inclusion at Stats Perform
By joining Stats Perform, you'll be part of a team that celebrates diversity, a team that is dedicated to creating an inclusive atmosphere where everyone feels valued and welcome. All employees are collectively responsible for developing and maintaining an inclusive environment. That is why our Diversity, Equity, and Inclusion goals underpin our core values. With increased diversity comes increased innovation and creativity, ensuring we're best placed to serve our clients and communities. Stats Perform is committed to seeking diversity, equity, and inclusion in all we do.

Posted 3 days ago

Apply

7.0 - 9.0 years

32 - 37 Lacs

Bengaluru

Work from Office

Our Finance Velocity Office was designed to accelerate our path to building a world-class finance function. This function is shaping our transformation strategy, improving our business operations, and enhancing the impact we make as a function.

Why velocity? Velocity is about both speed and strategy, focused on accelerating in a given direction. We will hone our approach, think big, and decide quickly, living our Leadership Principles. This dedicated team is helping to drive global consistency and operating as one team to build a foundation that supports the growth and complexity of our business and improves the day-to-day interactions of our Finance teams.

We are seeking a highly skilled and detail-oriented Finance Systems / Program Manager to lead a developer team and coordinate the end-to-end delivery of robust, scalable, and insightful dashboards and reporting systems. This role will be part of our FVO - Finance Business Intelligence team and will report to the Director of Management Reporting. This is a people manager role that will lead a team to deliver and maintain a growing portfolio of reporting and BI requests; it requires a strong technical background, data visualization and design experience (especially Power BI), and project/program management skills. The role will collaborate with finance stakeholders to meet their reporting needs, and also work closely with Corporate IT and the Data Strategy - Governance team on data enablement, pipelines, and standardization. Note: this is a global role and requires effective collaboration with stakeholders across multiple time zones. Strong time management skills and flexibility are essential.

Essential Functions:
- Proven experience leading and mentoring a team.
- Facilitate sprint planning, backlog grooming, daily stand-up meetings, reviews/demos, retrospectives, and other scrum ceremonies.
- Provide guidance on dashboard design, data modeling, and performance optimization in Power BI; review deliverables for accuracy, scalability, and performance.
- Collaborate with Corporate IT to ensure proper data pipeline integration and governance.
- Coach team members through the product development life cycle using Agile, Scrum, and Lean practices, and work cross-functionally throughout the company to ensure projects are developed and deployed with quality and timely delivery into our production systems.
- Proactively identify, manage, and mitigate project risks and find ways to accomplish project goals in the context of dynamic business/technical environments.
- Be accountable for the end-to-end planning, execution, and documentation of multiple projects; lead planning for scope, schedule, risks, deployments, and communications for projects and initiatives.
- Set clear expectations with business partners to provide transparency into project initiation activities, and manage day-to-day interaction during project execution and delivery.
- Achieve project goals by engaging effectively with stakeholders, including enterprise architects and strategists, technical subject matter experts, business partners, Visa senior management, and technology vendors, as needed.
- Collaborate and build relationships with cross-functional teams, including business leaders, developers, QA, and other technical teams.
- Maintain effective data operations using tools such as Python, Power BI, or other BI platforms.
- Manage multiple priorities in a fast-paced environment while maintaining a high level of accuracy and attention to detail.

Basic Qualifications:
- 7+ years of relevant work experience with a bachelor's degree
- 2+ years of experience in people management
- Proven experience leading and mentoring a team
- Strong attention to detail and commitment

Posted 3 days ago

Apply

8.0 - 10.0 years

11 - 15 Lacs

Mumbai

Work from Office

Employment Type: Permanent
Availability: 15 to 30 days
Remuneration Package: Market standard

Job Description:
We are seeking an experienced SAP BW (Business Warehouse) Consultant with 8 to 10 years of experience in SAP. The ideal candidate will have a strong background in SAP BW/4HANA and SAP Analytics Cloud (SAC).

Responsibilities:
1. Lead and participate in SAP BW implementation projects, focusing on BW/4HANA migration and integration with other SAP modules.
2. Design, develop, and maintain SAP BW data models, extractors, transformations, and reports to meet business requirements.
3. Collaborate with stakeholders to gather and analyze business requirements and translate them into technical solutions within the SAP BW framework.
4. Provide expertise in SAP BW/4HANA implementation, ensuring smooth migration and alignment with organizational objectives.
5. Develop and maintain SAP Analytics Cloud (SAC) reports and dashboards for data visualization and business intelligence.
6. Conduct end-user training sessions and provide ongoing support for SAP BW and SAC functionalities.
7. Collaborate with cross-functional teams to ensure seamless integration with other SAP modules and external systems.
8. Stay updated with SAP BW/4HANA and SAC best practices, new features, and industry trends.

Requirements:
1. Bachelor's degree in Computer Science, Information Technology, or a related field.
2. 8 to 10 years of experience working with SAP BW.
3. Strong expertise in SAP BW/4HANA, including data modeling, extraction, transformation, and reporting.
4. Experience with SAP Analytics Cloud (SAC) for data visualization and business intelligence.
5. Effective communication and interpersonal skills, with the ability to interact with stakeholders at all levels.
6. Ability to work independently and collaboratively in a team environment.

Posted 3 days ago

Apply

1.0 - 10.0 years

3 - 12 Lacs

Virudhunagar

Work from Office

Data Engineer Trainer
Virudhunagar, Kovilpatti
Job Type: Full Time / Part Time
Experience: 1-10 years
Salary: 1.5 LPA - 4 LPA

Skillsets:
- Programming Languages: Python, SQL, Scala/Java
- Data Modeling & Database Systems: relational databases, NoSQL databases, data warehouses
- ETL/ELT Development: Power BI, Apache Airflow, dbt, Luigi, Talend
- Big Data Technologies: Hadoop ecosystem, Apache Spark, Kafka, Hive/Presto/Trino
- Cloud Platforms: AWS, GCP, Azure

Non-IT Job Openings
We're looking for a detail-oriented Accountant to manage day-to-day financial operations for our Getin Technologies businesses. You'll ensure accurate bookkeeping, timely compliance, and insightful reporting to help drive our growth.

Key Responsibilities:
- Bookkeeping & Ledger Maintenance: Record all financial transactions (sales, purchases, receipts, payments) in Zoho Books. Maintain and reconcile the general ledger, sub-ledgers (payables, receivables), and bank statements.
- Month-End & Year-End Close: Prepare trial balance, P&L, balance sheet, and cash flow statements. Accrue expenses, prepayments, and depreciation schedules.
- Tax & Compliance: Calculate and file GST returns (GSTR-1, GSTR-3B) and other indirect tax returns. Assist with TDS/TCS calculations and quarterly filings. Support preparation and filing of income-tax returns (ITR-3/4) for the firms and proprietors.
- Payroll & Statutory Deductions: Run monthly payroll, calculate PF/ESI, and ensure timely remittance. Maintain employee attendance and leave records.
- Budgeting & Forecasting: Assist in the annual budgeting process and monthly variance analysis. Provide cash-flow projections and working-capital reporting.
- Audit & Controls: Coordinate statutory and tax audits, prepare schedules and audit queries. Implement and monitor internal financial controls and policies.

Qualifications & Experience:
- Bachelor's degree in Commerce (B.Com) or equivalent; CA Inter / ICWA Inter or CMA preferred
- Minimum 1-2 years of hands-on experience in accounting for service-oriented businesses
- Proven track record with Zoho Books or a similar ERP
- Sound knowledge of GST, TDS/TCS, Income-tax Act provisions, and statutory compliance

Core Skills & Competencies:
- Technical: Advanced MS Excel (VLOOKUP, pivot tables); familiarity with e-filing portals (GST, Income-tax, MCA)
- Analytical & Detail-Oriented: Strong numerical accuracy and ability to spot discrepancies

Posted 3 days ago

Apply

3.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Position: Informatica, SQL, Unix Developer
Location: Bangalore (permanent role)

Job Description:
The Informatica PowerCenter Developer designs, develops, and maintains ETL (Extract, Transform, Load) workflows using Informatica PowerCenter.

Responsibilities:
- Collaborate with business analysts and data architects to understand data requirements and implement effective data integration solutions, including data mapping, transformation, and loading from various sources into data warehouses or other target systems.
- Optimize ETL workflows for performance improvement and scalability.
- Monitor and troubleshoot ETL workflows, identify and resolve issues, and ensure data integrity and availability.
- Participate in data modeling discussions to support data warehousing initiatives.
- Create and maintain technical documentation for ETL processes, data mappings, and workflows.
- Conduct unit testing, integration testing, and performance testing to ensure the quality of ETL solutions.
- Experience with Shell and Python scripting; Tableau exposure is good to have.

Experience: 3-5 years
Education: B.Tech (CS/IT) or equivalent
Skills: Informatica, Python, Shell, SQL, Unix

Posted 3 days ago

Apply

1.0 - 2.0 years

17 - 19 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Overview of Role
As a Data Engineer specializing in AI/ML, you'll be instrumental in designing, building, and maintaining the data infrastructure crucial for training, deploying, and serving our advanced AI and Machine Learning models. You'll work closely with Data Scientists, ML Engineers, and Cloud Architects to ensure data is accessible, reliable, and optimized for high-performance AI/ML workloads, primarily within the Google Cloud ecosystem.

Responsibilities:
- Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient ETL/ELT data pipelines to ingest, transform, and load data from various sources into data lakes and data warehouses, specifically optimized for AI/ML consumption (a minimal loading sketch follows this section).
- AI/ML Data Infrastructure: Architect and implement the underlying data infrastructure required for machine learning model training, serving, and monitoring within GCP environments.
- Google Cloud Ecosystem: Leverage a broad range of Google Cloud Platform (GCP) data services including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Vertex AI, Composer (Airflow), and Cloud SQL.
- Data Quality & Governance: Implement best practices for data quality, data governance, data lineage, and data security to ensure the reliability and integrity of AI/ML datasets.
- Performance Optimization: Optimize data pipelines and storage solutions for performance, cost-efficiency, and scalability, particularly for large-scale AI/ML data processing.
- Collaboration with AI/ML Teams: Work closely with Data Scientists and ML Engineers to understand their data needs, prepare datasets for model training, and assist in deploying models into production.
- Automation & MLOps Support: Contribute to the automation of data pipelines and support MLOps initiatives, ensuring seamless integration from data ingestion to model deployment and monitoring.
- Troubleshooting & Support: Troubleshoot and resolve data-related issues within the AI/ML ecosystem, ensuring data availability and pipeline health.
- Documentation: Create and maintain comprehensive documentation for data architectures, pipelines, and data models.

Qualifications:
- 1-2+ years of experience in Data Engineering, with at least 2-3 years directly focused on building data pipelines for AI/ML workloads.
- Deep, hands-on experience with core GCP data services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Composer/Airflow.
- Strong proficiency in at least one relevant programming language for data engineering (Python is highly preferred). Strong SQL skills for complex data manipulation, querying, and optimization.
- Solid understanding of data warehousing concepts, data modeling (dimensional, 3NF), and schema design for analytical and AI/ML purposes.
- Proven experience designing, building, and optimizing large-scale ETL/ELT processes.
- Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop) and concepts.
- Exceptional analytical and problem-solving skills, with the ability to design solutions for complex data challenges.
- Excellent verbal and written communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.
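One pipeline stage named above, loading curated files from Cloud Storage into BigQuery for model training, could look like this minimal sketch using the google-cloud-bigquery client; the project, bucket, and dataset names are placeholders.

```python
# Hedged sketch: batch-load Parquet from GCS into BigQuery; names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-ml-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full refresh
)

load_job = client.load_table_from_uri(
    "gs://my-curated-bucket/features/*.parquet",
    "my-ml-project.ml_features.training_set",
    job_config=job_config,
)
load_job.result()  # block until the load completes

table = client.get_table("my-ml-project.ml_features.training_set")
print(f"Loaded {table.num_rows} rows for model training")
```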

Posted 3 days ago

Apply

2.0 - 7.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Draup - Multi-Dimensional Global Labor & Market Data
Data Analyst - Tech

Job Summary
We are looking for a highly skilled Big Data & ETL Tester to join our data engineering and analytics team. The ideal candidate will have strong experience in PySpark, SQL, and Python, with a deep understanding of ETL pipelines, data validation, and cloud-based testing on AWS. Familiarity with data visualization tools like Apache Superset or Power BI is a strong plus. You will work closely with our data engineering team to ensure data availability, consistency, and quality across complex data pipelines, and help transform business requirements into robust data testing frameworks.

Key Responsibilities:
- Collaborate with big data engineers to validate data pipelines and ensure data integrity across ingestion, processing, and transformation stages.
- Write complex PySpark and SQL queries to test and validate large-scale datasets (see the PySpark sketch after this posting).
- Perform ETL testing, covering schema validation, data completeness, accuracy, transformation logic, and performance testing.
- Conduct root cause analysis of data issues using structured debugging approaches.
- Build automated test scripts in Python for regression, smoke, and end-to-end data testing.
- Analyze large datasets to track KPIs and performance metrics supporting business operations and strategic decisions.
- Work with data analysts and business teams to translate business needs into testable data validation frameworks.
- Communicate testing results, insights, and data gaps via reports or dashboards (Superset/Power BI preferred).
- Identify and document areas of improvement in data processes and advocate for automation opportunities.
- Maintain detailed documentation of test plans, test cases, results, and associated dashboards.

Required Skills and Qualifications:
- 2+ years of experience in big data testing and ETL testing.
- Strong hands-on skills in PySpark, SQL, and Python.
- Solid experience working with cloud platforms, especially AWS (S3, EMR, Glue, Lambda, Athena, etc.).
- Familiarity with data warehouse and lakehouse architectures.
- Working knowledge of Apache Superset, Power BI, or similar visualization tools.
- Ability to analyze large, complex datasets and provide actionable insights.
- Strong understanding of data modeling concepts, data governance, and quality frameworks.
- Experience with automation frameworks and CI/CD for data validation is a plus.

Preferred Qualifications:
- Experience with Airflow, dbt, or other data orchestration tools.
- Familiarity with data cataloging tools (e.g., AWS Glue Data Catalog).
- Prior experience in a product or SaaS-based company with high-data-volume environments.

Why Join Us?
- Opportunity to work with a cutting-edge data stack in a fast-paced environment.
- Collaborate with passionate data professionals driving real business impact.
- Flexible work environment with a focus on learning and innovation.

Draup is a member of the Ethical AI Governance Group (EAIGG). As an AI-first company, Draup has been a champion of ethical and responsible AI since day one. Our models adhere to the strictest data standards and are routinely audited for bias.
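A hedged sketch of the PySpark-based validation work described above, covering schema, completeness, and a transformation-logic spot check; the table names and schema are invented for the example.

```python
# Illustrative ETL validation checks between a staging extract and its target.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("etl_validation").getOrCreate()

source = spark.table("staging.orders_raw")        # illustrative tables
target = spark.table("warehouse.orders_curated")

# 1. Schema validation: target must match the agreed contract
expected = StructType([
    StructField("order_id", StringType(), False),
    StructField("amount", DoubleType(), True),
])
assert target.schema == expected, f"schema drift: {target.schema}"

# 2. Completeness: no rows lost between staging and warehouse
assert source.count() == target.count(), "row-count mismatch"

# 3. Transformation logic: amounts should be non-negative after cleansing
bad = target.filter(F.col("amount") < 0).count()
assert bad == 0, f"{bad} negative amounts leaked through cleansing"
```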

Posted 3 days ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC?
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities:
- 4-7 years of experience in Data Engineering or related roles.
- Hands-on experience in Microsoft Fabric.
- Hands-on experience in Azure Databricks.
- Proficiency in PySpark for data processing and scripting.
- Strong command over Python and SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and data warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).
- Hands-on experience in performance tuning and optimization on Databricks and MS Fabric.
- Ensure alignment with overall system architecture and data flow.
- Understanding of CI/CD practices in a data engineering context.
- Excellent problem-solving and communication skills.
- Exposure to BI tools like Power BI, Tableau, or Looker.

Good to Have:
- Experience in Azure DevOps.
- Familiarity with data security and compliance in the cloud.
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.

Mandatory skill sets: Microsoft Fabric, Azure (Databricks & ADF), PySpark
Preferred skill sets: Microsoft Fabric, Azure (Databricks & ADF), PySpark
Years of experience required: 4-10
Education qualification: B.Tech/MBA/MCA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration

Required Skills: Microsoft Azure
Additional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Posted 3 days ago

Apply

2.0 - 7.0 years

50 - 55 Lacs

Bengaluru

Work from Office

The Amazon Alexa AI team in India is seeking a talented, self-driven Applied Scientist to work on prototyping, optimizing, and deploying ML algorithms within the realm of Generative AI.

Key responsibilities include:
- Research, experiment, and build proofs of concept advancing the state of the art in AI & ML for GenAI.
- Collaborate with cross-functional teams to architect and execute technically rigorous AI projects.
- Thrive in dynamic environments, adapting quickly to evolving technical requirements and deadlines.
- Engage in effective technical communication (written & spoken) with coordination across teams.
- Conduct thorough documentation of algorithms, methodologies, and findings for transparency and reproducibility.
- Publish research papers in internal and external venues of repute.
- Support on-call activities for critical issues.

Basic Qualifications:
- Master's or PhD in computer science, statistics, or a related field
- 2-7 years of experience in deep learning, machine learning, and data science
- Proficiency in coding and software development, with a strong focus on machine learning frameworks
- Experience in Python or another language; command-line usage; familiarity with Linux and AWS ecosystems
- Understanding of relevant statistical measures such as confidence intervals, significance of error measurements, development and evaluation data sets, etc.
- Excellent communication skills (written & spoken) and ability to collaborate effectively in a distributed, cross-functional team setting
- Papers published in AI/ML venues of repute

Preferred Qualifications:
- Track record of diving into data to discover hidden patterns and conducting error/deviation analysis
- Ability to develop experimental and analytic plans for data modeling processes, use of strong baselines, and ability to accurately determine cause-and-effect relations
- Motivation to achieve results in a fast-paced environment
- Exceptional level of organization and strong attention to detail
- Comfortable working in a fast-paced, highly collaborative, dynamic work environment
- 3+ years of experience building models for business applications
- Experience with patents or publications at top-tier peer-reviewed conferences or journals
- Experience programming in Java, C++, Python, or a related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing
- Knowledge of standard speech and machine learning techniques
- Experience using Unix/Linux
- Experience in professional software development

Posted 3 days ago

Apply

4.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled and analytical Data Scientist / Machine Learning Engineer to join our team. The ideal candidate will have a strong foundation in data science, machine learning, and statistical analysis, with hands-on experience in building scalable models and working with large datasets in cloud-based environments.

Key Responsibilities:
- Design, develop, and deploy machine learning models for real-world applications using Python, scikit-learn, TensorFlow, or PyTorch (a scikit-learn sketch follows this posting).
- Perform data cleaning, wrangling, transformation, and exploratory data analysis (EDA) using pandas and other Python libraries.
- Apply statistical techniques such as hypothesis testing, regression analysis, ANOVA, and probability theory to derive insights.
- Implement supervised and unsupervised learning algorithms; conduct feature engineering, model evaluation, and hyperparameter tuning.
- Develop data visualizations using Matplotlib, Seaborn, or ggplot to communicate findings effectively.
- Work with big data technologies such as Apache Hadoop, Apache Spark, or distributed databases for large-scale data processing.
- Write optimized SQL queries and manage relational databases for data extraction and transformation.
- Contribute to data engineering pipelines, including data integration, warehousing, and architecture design.
- Utilize cloud platforms like Microsoft Azure, including Azure Machine Learning, for scalable model deployment.
- Collaborate using version control systems like Git for code management and team coordination.
- (Optional/Desirable) Experience with YOLO or other deep learning models for object detection.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Engineering, or a related field.
- 4-8 years of experience in data science, machine learning, or AI development.
- Strong programming skills in Python and experience with data science libraries.
- Solid understanding of statistical methods, machine learning algorithms, and data modeling.
- Experience with cloud computing, big data tools, and database management.

Preferred Skills:
- Experience with YOLO, OpenCV, or computer vision frameworks.
- Familiarity with CI/CD pipelines for ML model deployment.
- Knowledge of MLOps practices and tools.

This role works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications:
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance.
3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. Builds the skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.
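The supervised-learning workflow listed above (preprocessing, model fitting, evaluation, hyperparameter tuning) might be sketched with scikit-learn as follows; synthetic data stands in for a real business dataset.

```python
# Illustrative end-to-end sklearn workflow on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),                      # preprocessing step
    ("model", RandomForestClassifier(random_state=0)),
])

# Hyperparameter tuning via 5-fold cross-validation
search = GridSearchCV(
    pipe,
    param_grid={"model__n_estimators": [100, 300],
                "model__max_depth": [5, None]},
    cv=5, scoring="roc_auc",
)
search.fit(X_train, y_train)

print(f"best params: {search.best_params_}")
print(f"held-out AUC: {search.score(X_test, y_test):.3f}")
```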

Posted 3 days ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Hyderabad

Work from Office

What you'll do
The following are high-level responsibilities you will take on, though the role is not limited to them:
Analyze business requirements.
Analyze the data model and perform gap analysis between business requirements and Power BI.
Design and model the Power BI schema.
Transform data in Power BI, SQL, or an ETL tool.
Create DAX formulas, reports, and dashboards.
Write SQL queries and stored procedures.
Design effective Power BI solutions based on business requirements.
Manage a team of Power BI developers and guide their work.
Integrate data from various sources into Power BI for analysis.
Optimize the performance of reports and dashboards for smooth usage.
Collaborate with stakeholders to align Power BI projects with goals.
Knowledge of data warehousing is a must; data engineering is a plus.

What you'll bring
B.Tech in computer science or equivalent.
Minimum 5+ years of relevant experience.

Job Category: IT
Job Type: Full Time
Job Location: Hyderabad
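
To illustrate the SQL-to-reporting handoff this role revolves around, a hedged sketch: Python pulls the output of a transformation stored procedure into a DataFrame for a quick sanity check before it feeds a Power BI dataset. The server, database, and procedure names are hypothetical, and the ODBC driver version depends on your environment.

```python
import pandas as pd
import pyodbc  # requires an installed ODBC driver for SQL Server

# Hypothetical connection details; substitute your environment's values.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-prod.example.com;DATABASE=SalesDW;Trusted_Connection=yes;"
)

# A stored procedure (hypothetical name) does the heavy transformation
# in SQL; the result lands in a DataFrame for validation before a
# Power BI dataset consumes it.
df = pd.read_sql("EXEC dbo.usp_MonthlySalesSummary @Year = ?", conn, params=[2024])

# Quick sanity checks before the data reaches a report.
print(df.head())
print(df.isna().sum())
conn.close()
```

Validating the procedure's output in code catches null explosions and schema drift before they surface as broken visuals in a published dashboard.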

Posted 3 days ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Pune

Work from Office

Job Summary
We are seeking a highly skilled Sr. Developer with 6 to 10 years of experience in MES L4/L3 & L2 integration and the OSIsoft/AVEVA PI Historian. The ideal candidate will work in a hybrid model during day shifts. This role does not require travel. Experience in data models is a plus.

Responsibilities
Develop and maintain MES L4/L3 & L2 integration solutions to ensure seamless data flow across systems.
Implement and manage the OSIsoft/AVEVA PI Historian to optimize data storage and retrieval.
Collaborate with cross-functional teams to design and deploy robust data integration frameworks.
Provide technical expertise in MES and Historian systems to support ongoing operations and projects.
Troubleshoot and resolve issues related to MES and Historian integrations to ensure system reliability.
Conduct regular system audits and performance tuning to maintain optimal system performance.
Develop and maintain documentation for all integration processes and system configurations.
Ensure compliance with industry standards and best practices in all integration activities.
Participate in code reviews and provide constructive feedback to peers.
Stay updated with the latest industry trends and technologies to continuously improve system integrations.
Work closely with stakeholders to understand business requirements and translate them into technical solutions.
Provide training and support to end users to ensure effective utilization of MES and Historian systems.
Contribute to the development of data models to enhance data analysis and reporting capabilities.

Qualifications
Extensive experience in MES L4/L3 & L2 integration.
Hands-on experience with the OSIsoft PI Historian and AVEVA PI Historian.
Experience in data models is nice to have.
Strong problem-solving and troubleshooting skills.
Excellent communication and collaboration skills.
Ability to work independently and as part of a team.
Detail orientation and the ability to manage multiple tasks simultaneously.
A proactive approach to learning and staying updated with new technologies.
Experience in developing and maintaining technical documentation.
A strong understanding of industry standards and best practices.
Ability to provide technical support and training to end users.
Ability to translate business requirements into technical solutions.
A strong focus on delivering high-quality solutions that meet business needs.

Certifications Required
Certified MES Developer
OSIsoft PI System Infrastructure Specialist
AVEVA PI System Certification
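
For a flavor of the Historian data retrieval this role involves, a hedged sketch using the PI Web API's REST endpoints to resolve a tag and pull a day of recorded values. The host, credentials, and tag path are hypothetical, and authentication specifics (basic, Kerberos, OIDC) vary by site.

```python
import requests

BASE = "https://pi-server.example.com/piwebapi"  # hypothetical PI Web API host
session = requests.Session()
session.auth = ("svc_user", "***")  # placeholder; use your site's auth scheme
session.verify = True

# Resolve a PI point (tag) to its WebId, then pull a day of recorded values.
tag_path = r"\\PISERVER\Plant1.Line3.Temperature"  # hypothetical tag
point = session.get(f"{BASE}/points", params={"path": tag_path}).json()
web_id = point["WebId"]

recorded = session.get(
    f"{BASE}/streams/{web_id}/recorded",
    params={"startTime": "*-1d", "endTime": "*"},
).json()

# Print the first few samples; Value may be a dict for digital states.
for item in recorded["Items"][:5]:
    print(item["Timestamp"], item["Value"])
```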

Posted 3 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Exciting Opportunity with a Top MNC (Full Time): SAP MDG Technical Consultant (Immediate Joiners Preferred)

Job Title: SAP MDG Technical Consultant / Techno-Functional Consultant (Material Master)
Location: Hyderabad, Bengaluru, Chennai, Mumbai, Pune, Delhi NCR, Coimbatore, Kolkata, Nagpur
Experience: 4 to 12 years
Notice Period: Immediate to 30 days

Job Description:
Project Experience: Involvement in 2 projects focused on SAP MDG for Material Master management.
Technical Expertise: Strong expertise in:
Data modeling, UI modeling (MDG UI configuration), and process modeling
Working with Fiori, MDG API frameworks, and BRF+
Implementation of handler and feeder classes
Data replication and the Data Replication Framework (DRF)
Data import using DIF, SOA services, ALE configuration, and value mapping
Development & Configuration:
Proficiency in ABAP object-oriented programming, ABAP Web Dynpro, and ABAP Floorplan Manager (FPM)
Experience with MDG workflows and integration with DQM
BADI enhancements implemented for:
Additional field validation, derivation, and defaulting
Email notifications for change requests
Mass Processing & IDoc Handling:
Experience in Material mass processing (creation and change)
Work with inbound/outbound IDocs, including change-pointer setup for Material Master

If you are interested in this opportunity, please share your updated CV at kalyan.g@fusionplusinc.com. We look forward to connecting with you!

Posted 3 days ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Senior Data Engineer

Responsibilities
Serve as a technical lead and designer; provide technical thought leadership in the development and continual improvement of the service.
Develop and maintain effective working relationships with team members and clients.
Hands-on experience designing and developing stored procedures/functions and leveraging them in ETL packages/workflows.
Hands-on experience designing and developing scripts for custom ETL processes and automation in a programming language such as PowerShell.
Design, develop, and test ETL processes per business requirements.

Qualifications we seek in you!
Minimum Qualifications
Relevant professional experience with database technologies and ETL tools (SSIS and Azure Data Factory).
Hands-on experience designing and developing scripts for custom ETL processes and automation in Azure Data Factory and Azure Databricks.
Good knowledge of the Azure cloud platform services stack: Azure Data Factory, Azure SQL Database, Azure Data Warehouse, Azure Databricks, Azure Blob Storage, Azure Data Lake Storage, HDInsight, Cosmos DB, etc.

Why join Genpact?
Lead AI-first transformation: build and scale AI solutions that redefine industries.
Make an impact: drive change for global enterprises and solve business challenges that matter.
Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build.
Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
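
As a hedged illustration of the Azure Data Factory automation the role describes, a minimal sketch that triggers and polls a pipeline run through the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are all hypothetical placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Hypothetical identifiers; substitute your subscription, resource group,
# factory, and pipeline names.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data-platform"
FACTORY = "adf-enterprise-etl"
PIPELINE = "pl_daily_load"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a pipeline run, passing runtime parameters.
run = adf.pipelines.create_run(
    RESOURCE_GROUP, FACTORY, PIPELINE,
    parameters={"load_date": "2024-01-31"},  # hypothetical pipeline parameter
)
print("run id:", run.run_id)

# Check the run's status; a real orchestration script would poll
# until it leaves Queued/InProgress.
status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id)
print("status:", status.status)  # e.g. Queued / InProgress / Succeeded
```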

Posted 3 days ago

Apply

3.0 - 8.0 years

6 - 8 Lacs

Noida

Work from Office

Urgent Hiring: Power BI Analyst (BPO)
Greetings from iEnergizer! We are looking for an experienced Power BI Analyst to join our Business Analytics team. Interested candidates can send their CV to vanshika.kakkar@ienergizer.com or WhatsApp it to 9289640609.

Requirements:
3+ years of experience with Power BI development and reporting
Strong hands-on knowledge of Power BI, Power Query, SQL queries, and data modelling
Proficiency in SQL for data extraction and transformation
Exposure to WFM concepts such as forecasting, scheduling, or capacity planning
Experience working in a BPO or call-centre environment
Strong analytical thinking and problem-solving skills

Key Responsibilities:
Design, develop, and maintain Power BI dashboards, reports, and visualizations
Perform SQL-based data extraction and ensure seamless integration into Power BI
Utilize Python scripting for data cleaning, transformation, and automation
Monitor and report on call-centre KPIs across Voice, Chat, Email, and OB
Collaborate with cross-functional teams to understand data requirements

6 days working | Day shifts | Work from office
CTC: up to 8 LPA
Location: iEnergizer, Noida Sector 60

Referrals are welcomed and encouraged.

Regards,
Vanshika Kakkar
TL - HR
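
To illustrate the Python-scripted cleaning step that sits in front of the Power BI dashboards here, a small pandas sketch that rolls interaction-level data up to daily per-channel KPIs. The file and column names are hypothetical stand-ins for a real contact-centre export.

```python
import pandas as pd

# Hypothetical interaction-level export; column names are illustrative.
raw = pd.read_csv("interactions.csv", parse_dates=["start_time", "end_time"])

# Typical cleaning before the data feeds a Power BI dashboard:
# normalise channel labels, drop obvious bad rows, derive handle time.
raw["channel"] = raw["channel"].str.strip().str.title()   # Voice / Chat / Email / Ob
raw = raw.dropna(subset=["agent_id", "start_time", "end_time"])
raw["handle_secs"] = (raw["end_time"] - raw["start_time"]).dt.total_seconds()
raw = raw[raw["handle_secs"].between(1, 4 * 3600)]        # discard outliers

# Daily KPI rollup per channel: volume and average handle time.
kpis = (
    raw.assign(day=raw["start_time"].dt.date)
       .groupby(["day", "channel"])
       .agg(volume=("agent_id", "size"), aht_secs=("handle_secs", "mean"))
       .reset_index()
)
kpis.to_csv("daily_kpis.csv", index=False)  # consumed by a Power BI dataset
```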

Posted 3 days ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Looking for a skilled professional with 8 to 14 years of experience to join our team as a Manager - Business Intelligence in Bangalore. The ideal candidate will have a strong background in business intelligence and analytics, with excellent problem-solving skills.

Roles and Responsibility
Develop and implement business intelligence strategies to drive growth and improvement.
Analyze complex data sets to identify trends and opportunities for process improvements.
Design and maintain databases and data systems to support business intelligence initiatives.
Collaborate with cross-functional teams to integrate business intelligence solutions into existing processes.
Provide actionable insights and recommendations to senior management.
Stay up to date with industry trends and emerging technologies in business intelligence.

Job Requirements
Strong understanding of business intelligence principles and practices.
Excellent analytical and problem-solving skills.
Ability to work effectively in a fast-paced environment with multiple priorities.
Strong communication and interpersonal skills.
Experience with data analysis and visualization tools.
Strong knowledge of database design and development.
Ability to lead and manage high-performing teams.

Industry: CRM / IT Enabled Services / BPO
Company name: Omega Healthcare Management Services Pvt. Ltd.

Posted 3 days ago

Apply