6.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
**Job Title:** ASC - Senior Power BI Developer
**Location:** Pune, Maharashtra, India
**Experience:** 6 - 8 years

**Job Description:** We are seeking a skilled and experienced Senior Power BI Developer to join our dynamic team in Pune. The ideal candidate will have extensive experience in Power BI development with a solid understanding of DAX queries and administration. As a Senior Power BI Developer, you will play a key role in transforming data into actionable insights, and you will be responsible for designing and maintaining interactive dashboards and reports.

**Key Responsibilities:**
- Design and develop Power BI reports and dashboards that meet business requirements and provide actionable insights.
- Create and optimize DAX queries for analytical calculations and data modeling.
- Collaborate with stakeholders to gather requirements and translate them into effective BI solutions.
- Manage Power BI administration tasks including user access, security, and performance tuning.
- Ensure data quality and consistency across reports and dashboards.
- Provide technical support and training to team members and end-users.
- Stay updated with industry trends and best practices in data visualization and analysis.

**Required Skills:**
- 6 to 8 years of experience in Power BI development.
- Strong proficiency in DAX queries and data modeling using Power BI.
- Experience with Power BI administration and user management.
- Excellent problem-solving skills and the ability to work independently as well as in a team.
- Strong analytical and communication skills.
- Familiarity with data warehousing concepts and ETL processes is a plus.

**Preferred Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certifications in Power BI or relevant technologies would be an advantage.

**What We Offer:**
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative work environment with a focus on innovation and creativity.

If you are passionate about data and BI technology and meet the above requirements, we would love to hear from you!

**How to Apply:** Interested candidates are invited to submit their resume along with a cover letter outlining their experience and suitability for the role to [insert application email or link].

---

**Roles and Responsibilities**

**Job Description: Senior Power BI Developer**

We are seeking a highly skilled Senior Power BI Developer with over 6 years of extensive experience in Power BI development and administration to join our dynamic team. The ideal candidate will have a deep understanding of data visualization, business intelligence solutions, and the latest features of Power BI. This role involves working on Microsoft Fabric and requires strong expertise in SQL, with additional experience in Snowflake being a plus. Knowledge of Power Automate and Power Apps, as well as experience in migration projects, will be highly advantageous.

**Key Responsibilities:**
- Design, develop, and maintain advanced Power BI reports and dashboards to support business decision-making.
- Administer Power BI environments, including workspace management, security, and governance, to ensure optimal performance and data integrity.
- Collaborate with stakeholders to gather and analyze business requirements, translating them into effective BI solutions.
- Work on Microsoft Fabric to integrate and manage data workflows, ensuring seamless data processing and reporting.
- Utilize the latest features of Power BI to enhance reporting capabilities and deliver innovative solutions.
- Write complex SQL queries to extract, transform, and load data from various sources for reporting purposes.
- Optimize data models and DAX calculations to improve the performance and usability of Power BI reports.
- Participate in data migration projects, ensuring smooth transitions and minimal disruption to business operations, if applicable.
- Integrate Power Automate and Power Apps to automate workflows and enhance application functionalities, where required.
- Mentor junior developers and provide technical guidance on Power BI best practices and solutions.

**Required Skills and Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in Power BI development and administration, with a proven track record of delivering high-quality BI solutions.
- Strong expertise in Microsoft Fabric for data integration and management.
- Proficiency in SQL for data extraction, transformation, and analysis.
- In-depth knowledge of the latest Power BI features and updates, with hands-on experience in implementing them.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication skills to interact with technical and non-technical stakeholders.

**Preferred Skills:**
- Experience working with Snowflake for cloud data warehousing solutions.
- Knowledge of Power Automate and Power Apps for workflow automation and application development.
- Prior involvement in migration projects, particularly related to BI tools or data platforms.
- Familiarity with other BI tools or technologies is a plus.
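For illustration only, the sketch below shows one way the Power BI administration duties above (workspace and user-access oversight) could be scripted from Python against the Power BI REST API; token acquisition is assumed to happen elsewhere (for example via MSAL), and the placeholder token and any workspace contents are hypothetical.

```python
# Illustrative sketch only: lists Power BI workspaces and their users via the
# Power BI REST API, the kind of administration task described above.
# Assumes an Azure AD access token for the Power BI service has already been
# obtained; token acquisition is out of scope here.
import requests

API = "https://api.powerbi.com/v1.0/myorg"

def list_workspaces(token: str):
    """Return the workspaces (groups) visible to the calling identity."""
    headers = {"Authorization": f"Bearer {token}"}
    resp = requests.get(f"{API}/groups", headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json().get("value", [])

def list_workspace_users(token: str, group_id: str):
    """Return the users and their access rights for one workspace."""
    headers = {"Authorization": f"Bearer {token}"}
    resp = requests.get(f"{API}/groups/{group_id}/users", headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    token = "<access-token>"  # placeholder: supply a real token in practice
    for ws in list_workspaces(token):
        print(ws["id"], ws["name"])
```

Scripting checks like this keeps recurring governance tasks (access reviews, workspace audits) repeatable rather than manual.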
Posted 3 days ago
12.0 - 15.0 years
15 - 25 Lacs
Noida, Chennai
Hybrid
We are looking for a Project Manager with Data Management experience. Experience - 12 to 15 years. Notice Period - Immediate to a maximum of 30 days. Location - Chennai & Noida. Required: experience with Project Management, Data Migration, Data Management, ETL, and SQL.
Posted 3 days ago
5.0 - 10.0 years
6 - 10 Lacs
Chennai
Remote
We are looking for a highly skilled Senior SQL Developer with strong ETL development experience and a solid background in data analysis. The ideal candidate will play a key role in designing and optimizing data pipelines, developing robust SQL queries, and transforming complex data sets into meaningful business insights. This position requires a combination of technical expertise, problem-solving skills, and a strategic mindset to support data-driven decision-making across the organization. Key Responsibilities: Design, develop, and optimize complex SQL queries, stored procedures, functions, and views for data extraction and reporting. Develop and maintain scalable ETL pipelines using tools such as Informatica, Talend, or custom scripts (Python, etc.). Collaborate with data architects, business analysts, and stakeholders to understand business requirements and deliver reliable data solutions. Analyze large datasets to uncover trends, identify anomalies, and support advanced analytics and reporting initiatives. Ensure data quality and integrity by performing thorough data validation and error handling. Monitor and optimize performance of SQL queries and ETL workflows. Participate in database design, modeling, and data warehouse architecture improvements. Document data flows, data models, and technical specifications. Mentor junior developers and contribute to code reviews and best practices. Required Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field. 5+ years of experience in SQL development and ETL processes. Proficiency in writing complex T-SQL (or PL/SQL) queries and performance tuning. Hands-on experience with ETL tools such as Informatica, Talend, or similar. Strong experience in working with relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL). Analytical mindset with experience in translating business requirements into data solutions. Experience with data warehousing concepts and dimensional data modeling. Proficient in data visualization and reporting tools such as Power BI or Tableau. Solid understanding of data governance, security, and compliance standards. Preferred: Experience with cloud-based data platforms (Azure Data Factory, AWS Glue, Google Cloud Dataflow). Knowledge of scripting languages like Python or shell scripting. Experience with Agile or DevOps methodologies. Strong understanding of business domains such as finance, healthcare, or e-commerce (if industry-specific). Work Environment: Remote work flexibility. Cross-functional team collaboration with data engineers, BI analysts, and business teams. Opportunities to work on enterprise-level data projects and emerging technologies. Please send your resume to srividyap@hexalytics.com.
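As a hedged illustration of the extract-transform-load work this role describes, here is a minimal Python sketch using SQLAlchemy and pandas; the connection strings, table names, and columns are hypothetical placeholders, not details from the posting.

```python
# Minimal ETL sketch: extract with SQL, transform with pandas, load into a
# reporting table. All connection strings and schema names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql+psycopg2://user:pass@source-host/sales")
target = create_engine("postgresql+psycopg2://user:pass@dwh-host/reporting")

# Extract: pull yesterday's orders with a plain SQL query.
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, order_ts "
    "FROM orders WHERE order_ts >= CURRENT_DATE - INTERVAL '1 day'",
    source,
)

# Transform: basic validation and a daily aggregate per customer.
orders = orders.dropna(subset=["order_id", "customer_id"])
orders["order_ts"] = pd.to_datetime(orders["order_ts"])
orders["amount"] = orders["amount"].astype(float)
daily = (
    orders.assign(order_date=orders["order_ts"].dt.date)
          .groupby(["order_date", "customer_id"], as_index=False)["amount"]
          .sum()
)

# Load: append into the warehouse fact table.
daily.to_sql("fact_daily_customer_sales", target, if_exists="append", index=False)
```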
Posted 3 days ago
1.0 - 2.0 years
3 - 5 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do: Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc. Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements; Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments; Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management; Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management; Bring transparency in driving assigned tasks to completion and report accurate status; Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams; Assist senior team members and delivery leads in project management responsibilities. What you'll bring: Bachelor's degree with specialization in Computer Science, IT or other computer-related disciplines with a record of academic success; Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements; Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.; Experience in data warehousing & SQL; Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP. Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams; Proven ability to work creatively and analytically in a problem-solving environment; Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects; Willingness to travel to other global offices as needed to work with clients or other internal project teams. Location - Bengaluru, Gurugram, Noida, Pune.
Posted 3 days ago
0.0 - 3.0 years
3 - 8 Lacs
Pune
Work from Office
Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What You'll Do: Partner closely with leads (development, product owners, stakeholders, etc.) to ensure a clear understanding of business and technical needs, assisting in selecting the best strategy and translating those needs into effective data solutions daily. Design and implement solutions in Spark using industry-leading best practices for life sciences data while ensuring optimal performance, efficient operations, security, and compliance with regulatory standards. Work with in-house product teams to optimize the existing data pipelines from an effort and cost perspective. Identify and test prototype solutions and proofs of concept. Support the testing team in identifying the optimal test strategy. Get reviews on designs to ensure consistency with architectural best practices; participate in regular implementation reviews to ensure consistent quality and adherence to standards. Work in an agile team to implement data quality checks, set up process orchestrations, mitigate vulnerabilities, devise testing strategies, and automate repeatable tasks, along with participating and assisting in root cause analysis activities. Ensure technical standards and architectural best practices are consistently followed. Govern the complete architecture and implementation of the project module. What You'll Bring: 0-3 years of data engineering experience with a track record in ETL, data modeling, data warehouse design, development, ingesting data from diverse data sources (databases or APIs), and performing complex transformations using Spark and related technologies. Strong expertise in Python and SQL. Hands-on experience developing and optimizing jobs with PySpark APIs along with Spark SQL. Hands-on experience working with the Linux operating system. Data-driven mindset with high analytical capability and a history of leveraging data and analytical techniques to optimize business decisions. Understanding of AWS services like EMR, S3, Glue and Redshift. Good to have: exposure to Databricks. Good to have: exposure to visualization tools like Power BI or Tableau.
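To make the PySpark/Spark SQL expectation concrete, below is a small, hedged sketch of the kind of job described above; the S3 paths, column names, and quality rule are illustrative assumptions rather than anything from the posting.

```python
# Hedged sketch of a PySpark job: ingest a source dataset, transform it with
# the DataFrame API and Spark SQL, and run a simple data quality check.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_pipeline").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/raw/claims/")  # assumed path

# DataFrame API transformation: keep valid rows and derive a month column.
clean = (
    claims.filter(F.col("claim_amount") > 0)
          .withColumn("claim_month", F.date_trunc("month", F.col("claim_date")))
)

# Spark SQL on the same data via a temporary view.
clean.createOrReplaceTempView("claims_clean")
monthly = spark.sql(
    "SELECT claim_month, payer_id, SUM(claim_amount) AS total_amount "
    "FROM claims_clean GROUP BY claim_month, payer_id"
)

# Simple data quality check: fail fast if mandatory keys are missing.
null_keys = clean.filter(F.col("payer_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows are missing payer_id")

monthly.write.mode("overwrite").parquet("s3://example-bucket/curated/monthly_claims/")
```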
Posted 3 days ago
1.0 - 2.0 years
2 - 4 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do: Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc. Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements; Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments; Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management; Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management; Bring transparency in driving assigned tasks to completion and report accurate status; Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams; Assist senior team members and delivery leads in project management responsibilities. What you'll bring: Bachelor's degree with specialization in Computer Science, IT or other computer-related disciplines with a record of academic success; Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements; Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.; Experience in data warehousing & SQL; Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP. Additional Skills: Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams; Proven ability to work creatively and analytically in a problem-solving environment; Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects; Willingness to travel to other global offices as needed to work with clients or other internal project teams.
Posted 3 days ago
5.0 - 8.0 years
4 - 7 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do: Strong understanding of data management, data cataloguing, and data governance best practices. Enterprise data integration and management experience working with data management, EDW technologies and data governance solutions. The ideal data governance & data catalog lead will call on their expertise in master data management (MDM), data governance, and data quality control to effectively oversee the data elements of a complex product catalog. Showcase a thorough understanding of designing and developing data catalogs and data assets on an industry-leading tool (an open-source catalog tool, Informatica Cloud Data Catalog, Alation, Collibra or Atlan) that serves as the inventory of collective data assets to help data owners, stewards, and business users discover relevant data for analytics and reporting. Must have experience with Collibra and Data Quality, including executing at least 2 large Data Governance/Data Quality projects from inception to production, working as the technology expert. Must have 5+ years of practical experience configuring data governance resources including business glossaries, resources, dashboards, policies, and search. Management of the Enterprise Glossary through the review of common business terms and definitions and continuous assessments to ensure data adheres to Data Governance Standards. Development and configuration of Collibra/Alation data catalog resources, data lineage, custom resources, custom data lineage, relationships, data domains, data domain groups and composite data domains. Implement Critical Data Elements to govern, with corresponding Data Quality rules, policies, regulations, roles, users, data source systems, and dashboards/visualizations for multiple data domains. Administration and management of the Collibra/Alation data catalogue tool, user groups, and permissions. Configuration of data profiling and data lineage. Work with Data Owners, stewards, and various stakeholders to understand Collibra/Alation catalogue requirements and configure them in the tool. What you'll bring: Bachelor's or Master's degree in Business Analytics, Computer Science, MIS or a related field with academic excellence. 3+ years of relevant professional experience in delivering small/medium-scale technology solutions. Ability to lead project teams, drive end-to-end activities, meet milestones, and provide mentorship/guidance for the team's growth. Strong understanding of RDBMS concepts, SQL, data warehousing and reporting. Experience with big data concepts, data management, data analytics and cloud platforms. Proficiency in programming languages like Python. Strong analytical and problem-solving skills, including expertise in algorithms and data structures. Additional Skills: Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations. Capability to simplify complex concepts into easily understandable frameworks and presentations. Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects. Travel to other offices as required to collaborate with clients and internal project teams. Location - Bengaluru, Gurugram, Noida, Pune.
Posted 3 days ago
1.0 - 6.0 years
8 - 13 Lacs
Pune
Work from Office
Azure Data Engineer | Pune, India | India Enterprise IT - 22756. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you'll do: Create and maintain optimal data pipeline architecture. Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability. Design, develop and deploy high-volume ETL pipelines to manage complex and near-real-time data collection. Develop and optimize SQL queries and stored procedures to meet business requirements. Design, implement, and maintain REST APIs for data interaction between systems. Ensure performance, security, and availability of databases. Handle common database procedures such as upgrade, backup, recovery, migration, etc. Collaborate with other team members and stakeholders. Prepare documentation and specifications. What you'll bring: Bachelor's degree in Computer Science, Information Technology, or a related field. 1+ years of experience with SQL, T-SQL, and Azure Data Factory, Synapse, or a relevant ETL technology. Strong analytical skills (impact/risk analysis, root cause analysis, etc.). Proven ability to work in a team environment, creating partnerships across multiple levels. Demonstrated drive for results, with appropriate attention to detail and commitment. Hands-on experience with Azure SQL Database. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
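The posting lists both ETL pipelines and REST APIs for data interaction; as a rough, non-authoritative sketch of the latter, the snippet below exposes rows from an Azure SQL table over HTTP using Flask and pyodbc. The connection string, table, and columns are placeholders, not anything specific to this role.

```python
# Hedged sketch: a small Flask service returning rows from an Azure SQL
# Database table as JSON. Credentials, table and column names are assumptions.
import pyodbc
from flask import Flask, jsonify

app = Flask(__name__)

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;DATABASE=analytics;"
    "UID=api_user;PWD=<secret>"  # placeholder credentials
)

@app.route("/orders/<int:customer_id>")
def orders_for_customer(customer_id: int):
    """Return recent orders for one customer as JSON."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT TOP 100 order_id, order_date, CAST(amount AS FLOAT) AS amount "
            "FROM dbo.orders WHERE customer_id = ? ORDER BY order_date DESC",
            customer_id,
        )
        cols = [c[0] for c in cursor.description]
        rows = [dict(zip(cols, row)) for row in cursor.fetchall()]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=8080)
```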
To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
Posted 3 days ago
6.0 - 10.0 years
8 - 12 Lacs
Gurugram
Work from Office
Join us as a Data Engineer. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level. What you'll do: Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers. You'll also be responsible for: Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions. Participating in the data engineering community to deliver opportunities to support our strategic direction. Carrying out complex data engineering tasks to build a scalable data architecture and the transformation of data to make it usable to analysts and data scientists. Building advanced automation of data engineering pipelines through the removal of manual stages. Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required. The skills you'll need: To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. We'll expect you to have experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, exploration and analysis, and data warehousing and data modelling capabilities. You'll also need: Experience of using a programming language such as Python for developing custom operators and sensors in Airflow, improving workflow capabilities and reliability. Good knowledge of Kafka and Kinesis for effective real-time data processing, and of Scala and Spark to enhance data processing efficiency and scalability. Great communication skills with the ability to proactively engage with a range of stakeholders.
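As a hedged sketch of the Airflow customisation this role mentions (custom operators and sensors written in Python), the snippet below defines a minimal operator and sensor on Airflow's base classes; the table, path, and "file landed" convention are illustrative assumptions rather than anything specific to this employer.

```python
# Hedged sketch of custom Airflow components (Airflow 2.x-style imports):
# an operator that loads one partition and a sensor that waits for a file.
import os

from airflow.models.baseoperator import BaseOperator
from airflow.sensors.base import BaseSensorOperator


class LoadPartitionOperator(BaseOperator):
    """Loads one date partition into a warehouse table (logic stubbed out)."""

    def __init__(self, table: str, partition_date: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.partition_date = partition_date

    def execute(self, context):
        # A real implementation would call the warehouse hook here.
        self.log.info("Loading %s partition %s", self.table, self.partition_date)


class LandingFileSensor(BaseSensorOperator):
    """Waits until an upstream landing file exists (check stubbed out)."""

    def __init__(self, path: str, **kwargs):
        super().__init__(**kwargs)
        self.path = path

    def poke(self, context) -> bool:
        exists = os.path.exists(self.path)
        self.log.info("Checked %s, exists=%s", self.path, exists)
        return exists
```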
Posted 3 days ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Lexington Partners is one of the world's largest and most trusted managers of secondary private equity and co-investment funds. Since our founding in 1994, we have been at the forefront of innovation in private equity investing, managing over $70 billion in committed capital and partnering with a global network of institutional investors, private equity firms, and portfolio companies. What are the ongoing responsibilities of the Associate Software Engineer (Data Engineer)? We are building a growing Data and AI team. You will play a critical role in the efforts to centralize structured and unstructured data for the firm. We seek a candidate with skills in data modeling, data management and data governance who can contribute first-hand towards the firm's data strategy. The ideal candidate is a self-starter with a strong technical foundation, a collaborative mindset, and the ability to navigate complex data challenges. What ideal qualifications, skills & experience would help someone to be successful? Bachelor's degree in computer science or computer applications, or equivalent experience in lieu of a degree, with 3 years of industry experience. Strong expertise in data modeling and data management concepts. Experience in implementing master data management is preferred. Sound knowledge of Snowflake and data warehousing techniques. Experience in building, optimizing, and maintaining data pipelines and data management frameworks to support business needs. Proficiency in at least one programming language, preferably Python. Collaborate with cross-functional teams to translate business needs into scalable data and AI-driven solutions. Take ownership of projects from ideation to production, operating in a startup-like culture within an enterprise environment. Excellent communication, collaboration, and an ownership mindset. Foundational knowledge of API development and integration. Knowledge of Tableau and Alteryx is good to have. Work Shift Timings - 2:00 PM - 11:00 PM IST
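For a concrete, hedged picture of the Snowflake-plus-Python pipeline work described above, the sketch below runs a query through the Snowflake Python connector and pulls the result into pandas; the account details, schema, and table names are placeholders, not real connection details.

```python
# Hedged sketch: query Snowflake and load the result into pandas.
# All connection parameters and object names are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="FIRM_DATA",
    schema="PORTFOLIO",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT fund_id, as_of_date, SUM(commitment_usd) AS total_commitment "
        "FROM fund_commitments GROUP BY fund_id, as_of_date"
    )
    df = cur.fetch_pandas_all()  # requires the pandas extra of the connector
    print(df.head())
finally:
    conn.close()
```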
Posted 3 days ago
4.0 - 9.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Position Title: Senior Data Engineer. Department: Development / Engineering. Reports To: Team Lead / Tech Lead. Work Location: On-site. Position Summary: The Senior Data Engineer is responsible for designing, building, and maintaining robust data pipelines and data warehouse solutions. This role works closely with data analysts, software engineers, and other stakeholders to ensure data is clean, reliable, and accessible for business insights. The ideal candidate will bring strong ETL development experience, proficiency in SQL and cloud platforms, and the ability to work independently on complex data initiatives. Essential Duties and Responsibilities: Key responsibilities include, but are not limited to: Design, develop, and maintain scalable ETL pipelines and data warehousing solutions using tools such as SQL Server, Redshift, and dbt. Write advanced SQL queries and scripts for data extraction, transformation, and loading from various structured and unstructured data sources. Collaborate with cross-functional teams to understand data needs and translate them into scalable and maintainable solutions. Maintain and improve existing data models and data flows; implement changes to optimize performance and reduce redundancy. Ensure data integrity, accuracy, and security throughout the data lifecycle. Work with cloud-based platforms (preferably AWS) to deploy, manage, and monitor data services and pipelines. Develop and maintain technical documentation related to data models, workflows, and systems architecture. Perform data profiling, cleansing, and validation to ensure high-quality and reliable analytics. Contribute to architectural decisions for enterprise data infrastructure and mentor junior data engineers when needed. Stay current with emerging trends and technologies in data engineering and recommend improvements to the existing infrastructure. Supervisory Responsibilities: None. Qualifications: To perform this role successfully, candidates must meet the following requirements. Education & Experience: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field (preferred). Minimum of 5 years of hands-on experience as a Data Engineer or in a similar role. Proven expertise in data warehousing concepts and ETL development tools (any). Proficient in SQL Server, AWS Redshift, and dbt for data modeling and transformation. Strong SQL scripting capabilities and intermediate Python programming skills. Exposure to cloud technologies, particularly AWS (e.g., S3, Lambda, Glue, Redshift, etc.). Experience in working with large-scale datasets and optimizing data pipelines for performance and scalability. Strong analytical thinking and problem-solving skills with the ability to work in a fast-paced environment. Language Skills: Ability to read and comprehend technical documentation and business requirements. Ability to draft moderately complex documentation and communicate clearly with team members. Strong verbal communication skills for both technical and non-technical audiences. Mathematical Skills: Ability to apply mathematical concepts such as fractions, percentages, ratios, and proportions to practical problems. Comfort with data profiling, statistics, and data quality metrics. Reasoning Ability: Strong ability to troubleshoot and resolve complex data issues. Ability to understand and solve ambiguous problems through structured thinking and analysis. Ability to handle multiple priorities and deliver quality outcomes within tight deadlines.
Computer Skills: Proficient in database technologies, SQL coding, and ETL tools. Comfortable using cloud services (AWS preferred), version control systems (Git), and development tools. Familiarity with workflow orchestration tools (e.g., Airflow, Prefect) is a plus. Certificates and Licenses: None required (relevant AWS certifications are a plus). Travel Requirements: Minimal travel (up to 5%) may be required for meetings, training, or conferences. Information Security & Privacy Compliance: Ensure the secure handling and storage of sensitive data. Implement access control protocols and data encryption standards. Maintain compliance with privacy regulations (e.g., GDPR, HIPAA) and internal data governance policies. Participate in regular security audits and proactively address vulnerabilities in data systems. Position Description Acknowledgment: By signing below, I acknowledge that I have reviewed and understood the duties and expectations associated with the Senior Data Engineer role. I understand that responsibilities may evolve over time and agree to communicate with my supervisor for clarification as needed. I also understand that performance evaluations and compensation may be tied to fulfilling these responsibilities. Employee's Name (Print): ______________________________________ Employee's Signature: ___________________________ Date: _______________ Manager's Signature: ___________________________ Date: _______________ Version 1, Effective June 2025
Posted 3 days ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Procurement Analyst (Strategic Sourcing, Contract Negotiation, Spend Analysis). What is the Enterprise Vendor Management group responsible for? The Enterprise Vendor Management team supports strategic initiatives and firm-wide objectives through the global management of labor and non-labor vendors for the purpose of delivering high-quality vendor products and services to grow our investment management business, while optimizing the total cost of ownership. What are the ongoing responsibilities of a Procurement Analyst? You will be a part of the Global team covering a multi-hundred-million-dollar Technology Spend Portfolio. You will work with the Global Tech & Business teams to understand their requirements for services, products, and platforms. You will work on new opportunities, renewals, extensions, buying additional modules, and services. You will be responsible for completing the assigned sourcing and contracting requests, while ensuring all records are up to date. You will follow up on requests during various stages and provide regular updates to stakeholders in the business, technology, legal, infosec and data privacy teams. You will launch and conduct sourcing events and monitor the intake queues for technology products and services requests. You will be involved in supplier negotiations (including contracts, commercials, sourcing events). You will review, comment on, and redline contract documents (MSAs, SOWs, EULAs, Order Forms, and others). You will partner with Legal to work through conflicting or difficult positions by thinking critically to provide recommendations or alternative options. You will manage complex, high-value sourcing projects from time to time, to ensure optimal pricing and mutually favorable contractual terms, while balancing risk and value. You will build partnerships with key stakeholders and leverage relationships to influence strategic sourcing initiatives. What ideal qualifications, skills & experience would help someone to be successful? Minimum of 3+ years of experience in Strategic Sourcing, Procurement, and/or Vendor Management with a focus on the Technology category (financial services industry experience is preferred). Minimum of a bachelor's degree or equivalent experience. A reasonable understanding of sourcing and contracting principles & methodologies. Ability to perform data & spend analytics. Familiarity with reviewing contracts including general contract provisions, redlines, comments, etc. Intermediate Excel skills (Pivots, Formulas, Queries, Macros). Familiarity with creating sourcing events, contract requests, and contract workspaces. Comfortable dealing with ambiguity and navigating through situations without clear directions. Able to handle multiple, sometimes competing priorities and manage them with a calm, collected approach. Familiarity with Coupa, Ariba, Ivalua, GEP, or other S2P application suites is a plus. Familiarity with data visualization tools (Power BI, Qlik, Tableau) is a plus. Job Level - Individual Contributor. Work Shift Timings - 2:00 PM - 11:00 PM IST
Posted 3 days ago
5.0 - 9.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Description: The AOP (Analytics Operations and Programs) team powers the data and insights for ROW Operations and is responsible for the development of business-critical BI and analytics solutions for Operations transcending all miles (barring Supply Chain). This role is the single-threaded leader responsible for building reporting and analytics capabilities for Ship with Amazon for NA and EU, and for ROW Finance, which includes the countries IN, AU, SG, MX, BR and AMET. The incumbent is responsible for enabling Operations teams to meet their respective S-team/VP goals and charters. The incumbent is a trusted partner to the respective Operations teams by sharing business-leading insights and prescriptive actions, requiring the incumbent to be fully aware of org-level strategic direction, key priorities and challenges [LP indexed: Earn Trust]. The primary objective of this role is to provide visibility on Operations performance and enable Operations to make data-driven decisions through developing: i) reporting capabilities, involving development of automated and self-service reports/dashboards; ii) analytics capabilities, involving building of tools, science-based artefacts and defect attribution/bridges. Specifically, the role is intended to: i) own business-critical (xBR, Peak, Contingency) reporting; ii) be a partner in enabling the launch of new programs/features/products by creating the right metrics that showcase adoption and success; iii) build capabilities that benchmark and improve operations performance; iv) build critical BI technology/tools aiding metric optimization, simulations, real-time tracking and alerting; v) build defect attribution bridges for key operations metrics to enable goal setting and tracking; vi) build datasets on Operations performance for ROW Finance and enable WBR reporting. Another critical objective of this role is to drive BI artefact and practice standardization for ROW. This will involve: i) understanding the landscape and defining the BI standards; ii) driving ROW-level standardization of key artefacts; iii) influencing and executing on this agenda by partnering with other teams; iv) setting BI engineering standards and practices so we can scale support non-linearly. A day in the life: Driving parity across marketplaces, standardization and centralized development of capabilities are the guiding principles to scale support non-linearly to newer scope and marketplaces. The incumbent is staffed with a team of ~10 members spread across India (Bengaluru, Hyderabad) locations. The incumbent needs to decide the right mix of job roles for the scope and the right locations considering HC costs and outcomes. The incumbent should actively provide career-choice opportunities in line with the emerging aspirations of team members to have diverse exposure to analytics technologies, and laterally move members to roles with higher technical complexity (e.g., BA -> BIE, BIE -> DS). The incumbent needs to keep the team motivated, build and retain expertise, codify processes for new employees to scale up, and attract talent into the team. Given the wide number of stakeholders, the incumbent should balance their needs while maintaining the work-life harmony of team members, should drive a culture of celebrating wins (small and large) and recognition, and foster strong collaboration across members [LP: Hire and Develop, Strive to be Earth's Best Employer]. About the team: The AOP (Analytics Operations and Programs) team powers the data and insights for ROW Operations and is responsible for the development of business-critical BI and analytics solutions for Operations transcending all miles. The proposed role within AOP is a critical one, requiring the holder to lead FC Analytics for ROW geographies. Basic Qualifications: 10+ years of business intelligence and analytics experience. 5+ years of delivering results managing a business intelligence or analytics team, including employee development and performance management experience. Experience working directly with business stakeholders to translate between data and business needs. Experience with data visualization using Tableau, QuickSight, or similar tools. Experience in data warehouse technical architectures, data modeling, infrastructure components, ETL/ELT and reporting/analytic tools and environments, data structures, and hands-on SQL coding. Preferred Qualifications: 10+ years of experience working with very large data warehousing environments. 10+ years of quantitative and qualitative data science/business intelligence experience with significant business impact. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, see amazon jobs / content / en / how-we-hire / accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Company: ADCI Karnataka. Job ID: A3035212
Posted 3 days ago
5.0 - 10.0 years
8 - 13 Lacs
Hyderabad
Work from Office
The Investment Risk Team's primary function is to provide the second line of defense for Investment Risk. We are looking for an Investment Risk Manager to join the Investment Risk Team in India. The primary function of this position is to assess daily derivative usage and liquidity across FT funds and to leverage technical skills to enhance the team's data analytics capabilities. The candidate is expected to deliver business intelligence and engineering improvements by understanding existing internal tools and data warehouses, identifying data quality and reliability improvements, and establishing best practices. What are the ongoing responsibilities of this position? Identify, monitor, and communicate issues related to regulatory risks. Leverage technical skills to enhance the team's data analytics capabilities. Assess daily derivative usage and liquidity across FT funds. Identify, reconcile and resolve data issues related to derivative and liquidity calculations. Assess and analyse the accuracy and quality of the underlying data and provide commentary for the risk trends. Design, develop, test, automate, and launch new dashboards and reporting solutions across the Regulatory Risk team. Ensure reports are delivered to the appropriate client(s) and/or provided via automated processes to downstream systems by the agreed-upon date. Create and run standard reports and queries. Review automated validation controls and complete the issue resolution process. Respond to ad-hoc requests for portfolio risk statistic information, and perform ad hoc analyses, such as stress tests, sensitivity or hedging analysis given various market conditions. Ensure client requirements are understood during product launch and account on-boarding, and that internal systems/processes are updated to support requirements. Remain current on portfolio and market risk related trends, topics, issues, systems and analytical techniques. What qualifications, skills and experience would help someone to be successful? Bachelor's degree (Master's preferred) in Finance, Computer Science, Mathematics or another quantitative discipline. CFA or FRM designation preferred. 5 - 10 years of relevant work experience in the Mutual Fund/Financial Services Industry. Experience with global regulatory rules including 18f-4, SEC 22e-4 and UCITS liquidity requirements. Experience with VaR and other risk metrics. Experience in programming languages, preferably VBA or R/Python, and in Power BI, Tableau or other data visualization tools. Experience in SQL for data extraction, manipulation and analyses, including complex joins. Experience using financial/risk applications/software: Bloomberg, MSCI Barra Risk Model, FactSet, and Morningstar. Knowledge of statistical calculations, financial instruments and markets, and GIPS Composites. Strong verbal and written communication skills. Strong attention to detail and excellent analytical skills. Ability to work independently, perform mathematical calculations or analysis, and exercise independent judgment consistent with department guidelines. Ability to organize and prioritize workflow and to coordinate the work of others. Ability to maintain updated knowledge of procedures, products and activities of the assigned area. Ability to accurately proofread documents, work under pressure, and perform multiple tasks in a fast-paced, team environment. Job Level - Individual Contributor. Work Shift Timings - 2:00 PM - 11:00 PM IST
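As an illustrative, non-authoritative example of the kind of risk metric this role works with, the snippet below computes a one-day historical-simulation VaR in Python on synthetic returns; the firm's actual methodology (and the regulatory calculations under Rule 18f-4 or UCITS) would differ.

```python
# Illustrative sketch only: one-day historical-simulation VaR on synthetic
# returns. In practice the return series would come from the risk warehouse.
import numpy as np

rng = np.random.default_rng(seed=42)
daily_returns = rng.normal(loc=0.0, scale=0.01, size=500)  # synthetic daily returns

portfolio_value = 100_000_000  # assumed portfolio NAV in USD

def historical_var(returns: np.ndarray, value: float, confidence: float = 0.99) -> float:
    """One-day historical VaR: the loss at the (1 - confidence) return quantile."""
    cutoff = np.percentile(returns, (1 - confidence) * 100)
    return -cutoff * value

print(f"99% 1-day VaR: {historical_var(daily_returns, portfolio_value):,.0f} USD")
```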
Posted 3 days ago
3.0 - 5.0 years
8 - 12 Lacs
Hyderabad
Work from Office
What is IBOR Services responsible for? The team is responsible for a variety of investment accounting functions such as recording cash-related transactions in the SimCorp Dimension application, reconciling cash and security positions to various custodian records, and ensuring the portfolios are in good order for Portfolio Managers. What is the Senior Analyst - IBOR responsible for? The Senior Analyst is responsible for reviewing all IBOR Services operations through process and data analysis in order to ensure that appropriate internal controls are in place. This may include reconciling, analysis, and/or reporting, and resolving non-routine problems in a timely manner in order to minimize financial and operational risk exposure. The role supports the IBOR Services team's initiatives, providing leadership and expertise in all key functions related to IBOR Services, and may assist with the planning and administration of the daily work assigned to staff in order to ensure it is completed in accordance with departmental guidelines. What are the ongoing responsibilities of the Senior Analyst - IBOR? Core Responsibilities: Ensure timely completion of reconciliations and daily/periodic processes within a specified line group. Review and provide signoffs for accounting transactions which breach specified thresholds. Support analysts in the resolution of cash and/or security breaks in a timely manner through the use of effective communication skills. Monitor daily and monthly reporting requirements to ensure department deliverables are met. Prepare/review applicable reporting to internal and external entities. Propose procedure revisions as weaknesses and inefficiencies are identified. Assist the supervisor in staff and workflow planning to ensure proper coverage of daily work within a specified line group. Provide training/cross-training to new/existing team members. Actively play a key role in project and process implementation, i.e., requirements gathering, gap analysis, roll-out of processes/procedures, training, etc. Problem solving, decision-making and analytical skills: Analyze, identify, and report trends in a timely manner. Recommend ways to minimize the reoccurrence of any exceptions noted. Guide the team in resolving non-routine problems and escalate to the supervisor and/or manager. Ensure timely resolution of these issues considering the impact to other areas and sites. Liaise with internal and external teams to resolve issues and discrepancies. Test and recommend process or product changes to maximize system efficiencies or enhancements and ensure that appropriate internal controls are in place. Other Responsibilities: Assist in the compilation of management reporting such as performance metrics and ad hoc reporting. Complete tracking of assigned goals for performance management. Offer suggestions for improvement to department workflows. Attend, participate in and provide feedback for department meetings. Work on special projects as assigned. Assist with maintaining up-to-date department procedures. What ideal qualifications, skills & experience would help someone to be successful?
Bachelor's Degree or equivalent experience in Business, Accounting or Finance preferred. 3-5 years of accounting experience in the financial services industry. Basic knowledge of mutual fund industry regulations and accounting standards. Good knowledge of MS Excel and other Microsoft Office applications. Able to work independently, take initiative and demonstrate accountability. Good analytical and organizational skills. Good verbal and written communication skills. Experience with the SimCorp Dimension Accounting System and Reconciliation Manager would be viewed favorably. Proficiency in business intelligence tools (VBA, Power Apps, Alteryx) preferred. Work Shift Timings - 2:00 PM - 11:00 PM IST. Experience our welcoming culture and reach your professional and personal potential!
Posted 3 days ago
3.0 - 6.0 years
14 - 19 Lacs
Gurugram
Work from Office
Job Purpose and Impact: The Analytics and Reporting Specialist I will develop and maintain basic analysis and reporting solutions for performance management. In this role, you will partner with business stakeholders to provide accurate data, analysis models and reports. You will also provide technical support to analysts to troubleshoot data and assist with model or report issues. Key Accountabilities: Assist the team to utilize data to understand current trends and conditions. Gather, verify and organize data for consumption and ensure data is complete, clean, accurate and properly structured. Perform report or analysis model maintenance and support. Collaborate with the team to understand events and activities reflected in the data. Perform basic troubleshooting, administration and optimization on reports and dashboards. Handle basic issues and problems under direct supervision, while escalating more complex issues to appropriate staff. Other duties as assigned. Qualifications: Bachelor's degree in a related field or equivalent experience. Other minimum qualifications may apply. Certification in a programming language or business intelligence tools. Confirmed skills using reporting or data analysis tools. Confirmed skills in writing or modifying database queries.
Posted 3 days ago
8.0 - 13.0 years
25 - 40 Lacs
Mumbai, Hyderabad
Work from Office
Essential Services: Role & Location Fungibility. At ICICI Bank, we believe in serving our customers beyond our role definition, product boundaries, and domain limitations through our philosophy of customer 360-degree. In essence, this captures our belief in serving the entire banking needs of our customers as One Bank, One Team. To achieve this, employees at ICICI Bank are expected to be role- and location-fungible with the understanding that Banking is an essential service. The role descriptions give you an overview of the responsibilities; they are only directional and guiding in nature. About the Role: As a Data Warehouse Architect, you will be responsible for managing and enhancing the data warehouse that manages large volumes of customer life-cycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse, i.e. Vertica. In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouses will be gradually migrated to a Data Lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team towards this migration. Key Responsibilities: Data Pipeline Design: Responsible for designing and developing ETL data pipelines that can help in organising large volumes of data, and for the use of data warehousing technologies to ensure that the data warehouse is efficient, scalable, and secure. Issue Management: Responsible for ensuring that the data warehouse is running smoothly. Monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance. Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations and continuous improvements. Data Integration and Processing: Responsible for processing, cleaning, and integrating large data sets from various sources to ensure that the data is accurate, complete, and consistent. Data Modelling: Responsible for designing and implementing data modelling solutions to ensure that the organization's data is properly structured and organized for analysis. Key Qualifications & Skills: Education Qualification: B.E./B.Tech. in Computer Science, Information Technology or an equivalent domain with 10 to 12 years of experience and at least 5 years of relevant work experience in Data Warehouse/Mining/BI/MIS. Experience in Data Warehousing: Knowledge of ETL and data technologies and the ability to outline a future vision in OLTP and OLAP (Oracle / MS SQL). Data Modelling, Data Analysis and Visualization experience (experience with analytical tools like Power BI / SAS / QlikView / Tableau, etc.). Good to have: exposure to Azure Cloud Data platform services like COSMOS, Azure Data Lake, Azure Synapse, and Azure Data Factory. Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions. Certification: Azure certifications DP-900, PL-300, DP-203 or any other Data Platform/Data Analyst certifications.
About the Business Group The Technology Group at ICICI Bank is at the forefront of our operations and offerings, which are focused on leveraging state-of-the-art technology to provide customer-centric solutions. This group plays a pivotal role in our vision of the transition from Bank to Bank Tech. Further, the group offers round-the-clock support to our entire banking ecosystem. In our persistent efforts to provide products and solutions that genuinely touch customers, unlocking the potential of technology in every single engagement would go a long way in creating customer delight. In this endeavor, we also tirelessly ensure all our processes, systems, and infrastructure are very well within the guardrails of data security, privacy, and relevant regulations.
Posted 3 days ago
5.0 - 10.0 years
25 - 40 Lacs
Gurugram
Work from Office
Job Title: Data Engineer. Job Type: Full-time. Department: Data Engineering / Data Science. Reports To: Data Engineering Manager / Chief Data Officer. About the Role: We are looking for a talented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and systems that process and store large volumes of data. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions. This role requires a strong background in data engineering, database technologies, and cloud platforms, along with the ability to work in an Agile environment to drive data initiatives forward. Responsibilities: Design, build, and maintain scalable and efficient data pipelines that move, transform, and store large datasets. Develop and optimize ETL processes using tools such as Apache Spark, Apache Kafka, or AWS Glue. Work with SQL and NoSQL databases to ensure the availability, consistency, and reliability of data. Collaborate with data scientists and analysts to ensure data requirements and quality standards are met. Design and implement data models, schemas, and architectures for data lakes and data warehouses. Automate manual data processes to improve efficiency and data processing speed. Ensure data security, privacy, and compliance with industry standards and regulations. Continuously evaluate and integrate new tools and technologies to enhance data engineering processes. Troubleshoot and resolve data quality and performance issues. Participate in code reviews and contribute to a culture of best practices in data engineering. Requirements: 3-10 years of experience as a Data Engineer or in a similar role. Strong proficiency in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra). Experience with big data technologies such as Apache Hadoop, Spark, Hive, and Kafka. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud. Proficiency in Python, Java, or Scala for data processing and scripting. Familiarity with data warehousing concepts, tools, and technologies (e.g., Snowflake, Redshift, BigQuery). Experience working with data modeling, data lakes, and data pipelines. Solid understanding of data governance, data privacy, and security best practices. Strong problem-solving and debugging skills. Ability to work in an Agile development environment. Excellent communication skills and the ability to work cross-functionally.
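To illustrate the streaming side of the stack listed above, here is a hedged Python sketch of a Kafka consumer using confluent-kafka; the broker address, topic, and event fields are assumptions for demonstration only.

```python
# Hedged sketch: consume JSON events from a Kafka topic and buffer them for a
# downstream load. Broker, topic and field names are illustrative assumptions.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",
    "group.id": "clickstream-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["clickstream-events"])

buffer = []
try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a message
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value().decode("utf-8"))
        buffer.append({"user_id": event.get("user_id"), "url": event.get("url")})
        if len(buffer) >= 1000:
            # In a real pipeline this batch would be written to the lake/warehouse.
            print(f"Flushing {len(buffer)} events")
            buffer.clear()
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```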
Posted 3 days ago
15.0 - 19.0 years
0 Lacs
pune, maharashtra
On-site
Are you an analytic thinker who enjoys creating valuable insights with data? Do you want to play a key role in transforming our firm into an agile organization? At UBS, we re-imagine the way we work, connect with each other - our colleagues, clients, and partners - and deliver value. Being agile will make us more responsive, adaptable, and ultimately more innovative. We are looking for a Data Engineer to transform data into valuable insights that inform business decisions, utilizing our internal data platforms and applying appropriate analytical techniques. You will be responsible for engineering reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, effectively using data platform infrastructure. Additionally, you will develop, train, and apply machine-learning models to make better predictions, automate manual processes, and solve challenging business problems. Ensuring the quality, security, reliability, and compliance of our solutions by applying digital principles and implementing both functional and non-functional requirements is a key aspect of this role. You will also be involved in building observability into our solutions, monitoring production health, helping to resolve incidents, and remediating the root cause of risks and issues while understanding, representing, and advocating for client needs. The WMA Data Foundational Platforms & Services Crew is the fuel for the WMA CDIO, providing the foundational, disruptive, and modern platform and technologies. The mission is rooted in the value proposition of a shared, foundational platform across UBS to maximize business value. To be successful in this role, you should have a bachelor's degree in Computer Science, Engineering, or a related field, along with 15+ years of experience and strong proficiency with Azure cloud services related to data and analytics (Azure SQL, Data Lake, Data Factory, Databricks, etc.). Experience with SQL and data modeling, as well as familiarity with NoSQL databases, is essential. Proficiency in programming languages such as Python or Scala, and knowledge of data warehousing and data lake concepts and technologies, are also required. UBS, the world's largest and only truly global wealth manager, operates through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in all major financial centers in more than 50 countries, our global reach and expertise set us apart from our competitors. At UBS, we value our people and their diverse skills, experiences, and backgrounds as the driving force behind our ongoing success. We are dedicated to our craft, passionate about putting our people first, offering new challenges, a supportive team, opportunities to grow, and flexible working options when possible. Our inclusive culture brings out the best in our employees at every stage of their career journey. Collaboration is at the heart of everything we do because together, we are more than ourselves. UBS is committed to disability inclusion, and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. UBS is an Equal Opportunity Employer that respects and seeks to empower each individual, supporting diverse cultures, perspectives, skills, and experiences within our workforce.
Posted 3 days ago
0.0 - 4.0 years
0 Lacs
Andhra Pradesh
On-site
We are seeking freshers for the position of Business Analytics Reporting / ETL Developer in our organization. As part of this role, you will be involved in end-to-end BI projects encompassing data consolidation, data modeling, reporting, and visualization development. Following a 4-month training period, you will have the opportunity to take on various responsibilities, including client interaction and supporting leads in meeting client needs.

Your responsibilities will include assisting with data analytics and BI deployments, documenting specifications, performing data validation and quality checks, developing scripts for extraction, transformation, and loading, conducting testing and validation against requirements and source data, designing and developing dashboards, and publishing and scheduling business and operational reports.

To excel in this role, you must be committed to remaining with the company for at least 2 years after the training period. Basic knowledge of databases and SQL is essential, along with strong written and oral communication skills. Familiarity with data warehousing concepts is desirable. Good interpersonal skills are also crucial for building successful client relationships, especially with clients in remote locations. Attention to detail, organizational skills, the ability to work under pressure and meet deadlines consistently, and a proactive attitude towards tasks are valued qualities. Taking ownership of specific work areas and demonstrating promptness and initiative are key attributes we are looking for in potential candidates.

Education Qualifications: BCA / B.Sc Computers / B.Tech / B.E. / MCA

This position is based in Visakhapatnam.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As an ETL Data Engineer, you will leverage your advanced database skills and extensive experience in T-SQL, performance tuning, and optimization to streamline data integration and transformation processes using ETL tools such as SSIS. With 5-7 years of experience, you will bring a deep understanding of data warehousing, business intelligence, and dimensional modeling principles. The role also involves managing multiple deliverables and projects simultaneously, so the ability to multitask effectively is essential.

To excel in this role, you must possess excellent communication and collaboration skills, enabling you to work efficiently within a team, along with comprehensive data warehousing and business intelligence expertise to keep data processes optimized.

The selection process consists of an initial profile screening followed by L1 and L2 technical interview rounds; the L2 round is a face-to-face discussion in Bangalore, so candidates based in Bangalore can attend it in person. Please note that PAN and Date of Birth details are mandatory for application submission, and your CV should be submitted in MS Word format.

If you meet the above requirements and are enthusiastic about applying your expertise in ETL data engineering, we encourage you to apply for this exciting opportunity.
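As a hedged sketch of one routine loading optimisation in a SQL Server environment like the one described above, the snippet below bulk-inserts rows from Python using pyodbc's fast_executemany instead of row-by-row inserts; the connection string, staging table, and sample rows are hypothetical.

```python
# Hypothetical bulk load into a SQL Server staging table using pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=StagingDB;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # send parameter batches rather than one round trip per row

rows = [(1, "north", 120.50), (2, "south", 98.00), (3, "east", 143.25)]
cursor.executemany(
    "INSERT INTO dbo.stg_sales (sale_id, region, amount) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
conn.close()
```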
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
Maharashtra
On-site
As a Tableau Developer with a minimum of 4 years of experience in a data visualization role specializing in Tableau, you will be a valuable addition to our team in Worli, Mumbai. Your primary responsibilities will include preparing and structuring data for use in Tableau, creating and optimizing Tableau dashboards, generating insightful data reports, and collaborating closely with various teams to meet business requirements. If you are a data-driven professional with a passion for visualization and reporting, we are excited to hear from you!

In this role, you will work on a variety of tasks such as establishing and maintaining connections to various data sources, developing workbooks and Hyper extracts, publishing workbooks and data sources to Tableau Server, and conducting unit testing to ensure data accuracy and functionality. You will also be responsible for troubleshooting and resolving performance or data-related issues in Tableau workbooks and data sources, as well as providing knowledge transfer and support to end-users.

To be successful in this role, you must have hands-on experience in creating table calculations, functions, filters, calculated fields, parameters, reference lines, bins, sets, groups, and hierarchies in Tableau. Expertise in developing interactive dashboards, data blending, and performance tuning of Tableau dashboards is essential. Additionally, you should be proficient in writing and optimizing SQL queries for relational databases, connecting to multiple data sources, ensuring data integrity, and updating data as needed.

If you are skilled in Tableau Desktop and Tableau Server, have experience with Python automation and ETL processes, and are proficient in Microsoft Excel, including advanced functions, pivot tables, and data manipulation, we encourage you to apply. Strong communication, problem-solving, and critical-thinking skills are also key requirements for this role.

Join our dynamic and innovative environment where you will collaborate with a talented team of professionals, have opportunities for growth and learning in the field of data visualization and business intelligence, and receive competitive compensation and benefits. If you are passionate about data, analytics, and creating impactful visualizations, submit your updated resume and a brief cover letter outlining your experience in Tableau development and data visualization to apply.
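As one hedged example of the "preparing and structuring data for use in Tableau" and SQL skills described above, the sketch below pulls a query result into pandas and writes a flat extract file that Tableau Desktop could connect to; the database URL, query, and output path are assumptions, not details from the posting.

```python
# Hypothetical data-prep step: query a relational source and write a flat file for Tableau.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@example-host:5432/sales_db")

query = """
    SELECT region, product, order_date, SUM(amount) AS total_amount
    FROM sales
    GROUP BY region, product, order_date
"""

df = pd.read_sql(query, engine)

# Light cleanup here so the dashboard does not have to repeat it.
df["order_date"] = pd.to_datetime(df["order_date"])
df = df.dropna(subset=["region", "product"])

# Tableau can connect directly to this CSV (or it can be converted to a .hyper extract).
df.to_csv("sales_extract.csv", index=False)
```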
Posted 3 days ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
You are a Senior Developer with 5 to 10 years of experience and full database, data warehouse, and data mart development capabilities. Your expertise includes proficiency with Azure Data Factory and database programming, encompassing stored procedures, SSIS development, and user-defined data types. As a senior subject matter expert (SME) in SQL Server, you are instrumental in handling standard file formats and markup language technologies such as JSON, XML, and XSLT. Your proficiency extends to Azure Cloud data environments, making you a valuable asset. Your exceptional verbal communication skills, along with good problem-solving abilities and attention to detail, set you apart. Additionally, your expertise in data warehousing principles and design further bolsters your profile.

Nice-to-have skills include familiarity with development frameworks such as .NET and Azure DevOps, as well as Databricks and Unity Catalog. Exposure to other database technologies such as MongoDB, Cosmos DB, MySQL, or Oracle is a plus. Knowledge of fundamental front-end languages such as HTML, CSS, and JavaScript, along with familiarity with JavaScript frameworks like AngularJS, React, and Ember, is advantageous. Moreover, acquaintance with server-side languages such as Python, Ruby, Java, PHP, and .NET would be beneficial.
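To illustrate the "standard file formats and markup language" part of the profile, here is a small sketch converting an XML payload to JSON using only the Python standard library; the element names and sample data are made up for the example.

```python
# Convert a small XML payload into JSON using only the standard library.
import json
import xml.etree.ElementTree as ET

xml_payload = """
<orders>
    <order id="1001"><customer>Acme</customer><amount>250.00</amount></order>
    <order id="1002"><customer>Globex</customer><amount>99.50</amount></order>
</orders>
"""

root = ET.fromstring(xml_payload)
orders = [
    {
        "id": order.attrib["id"],
        "customer": order.findtext("customer"),
        "amount": float(order.findtext("amount")),
    }
    for order in root.findall("order")
]

print(json.dumps(orders, indent=2))
```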
Posted 3 days ago
5.0 - 15.0 years
0 Lacs
Hyderabad, Telangana
On-site
You have extensive IT experience of 10-15 years in implementing BI solutions using Oracle data technologies such as ODI, OCI-Data Integrations, OIC, ADW, OBIA/FAW, and OAC/Tableau. Additionally, you possess at least 5+ years of Oracle Cloud expertise with Oracle Cloud-based data technologies and services, including OAC, ADW, Object Storage, FAW, and Data Lake on OCI. Your experience includes EBS or Fusion ERP Analytics in at least two end-to-end project implementations, and you have strong industry domain expertise in Life Sciences, Healthcare, Pharmacy, Banking, and Finance.

In this role, you will work closely with business users, IT teams, and management to understand data requirements and reporting needs. Your responsibilities will include designing the data warehouse architecture, including data models, ETL processes, and Oracle Cloud native integration flows. You will act as a subject matter expert and onshore coordinator, providing guidance to the OAC and ODI teams and collaborating with functional teams on data models and source table configurations. It will be crucial for you to ensure data models align with business requirements and support reporting and analytical needs.

You will work with developers, DBAs, and IT staff to implement data warehouse solutions according to the designed architecture. Additionally, you will be responsible for creating and maintaining detailed documentation for the architecture, data models, ETL processes, and configurations. Providing end-user training and support for reporting and analysis, as well as assisting with troubleshooting, will also be part of your responsibilities. It is essential to stay updated on the latest data warehousing technologies and continuously seek opportunities for performance and process improvements.

Your primary skills should include data warehousing, data modeling, SQL, FAW/FDI, data visualization tools such as Tableau, Oracle DV, and OAC/OAS, and data integration tools such as ODICS/ODI, Informatica, and ADF/SSIS. Experience in BI, data warehousing, data modeling design, and BI implementations is good to have. Knowledge of cloud computing platforms and deployment (Oracle Cloud, AWS, GCP, Azure) will be beneficial, as will data engineering/ETL skills in any technology stack such as Azure Databricks, Snowflake, etc.

If you meet the requirements mentioned above and are interested in this position, please share your resume to: jobs@360extech.com
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a member of the Global IT SAP Team at Shure, you will play a vital role in driving business transformation and maximizing business value through the implementation and support of SAP solutions. Reporting to the Associate Director, SAP Business Applications Finance, you will collaborate with internal IT associates and business users globally to build, enhance, and support solutions that align with industry best practices and technology trends.

Your responsibilities will include contributing to requirement gathering, solution design, configuration, testing, and implementation of end-to-end solutions. You will work closely with business stakeholders to understand requirements, provide deep SAP functional expertise, and analyze key integration points. Adhering to IT guiding principles, you will focus on leveraging standard processes, minimizing customization, and driving positive customer experiences.

As a SAP Senior Analyst Finance, you will stay updated on evolving SAP technologies, propose innovative solutions, and provide impact analysis on enhancements or new solutions. Additionally, you will offer application support, collaborate with the SAP development and security teams, and ensure compliance with security and data standards.

To qualify for this role, you should hold a Bachelor's degree in Finance, Computer Science, or a related field, with a minimum of 5 years of experience in enterprise systems implementation, specifically in SAP FICO S4HANA. Experience with data warehousing platforms and tools is desirable, along with a strong understanding of SAP modules, technical components, and project management methodologies.

Key competencies for success in this role include adaptability, critical thinking, customer focus, decision quality, communication, leadership, drive for results, integrity, relationship building, analytical skills, teamwork, collaboration, and influence. Your ability to quickly learn new concepts, follow operational policies, and travel to remote facilities when required will be essential.

At Shure, we are committed to being the most trusted audio brand worldwide, driven by our Core Values of quality, reliability, and innovation. If you are passionate about creating an inclusive and diverse work environment and possess the skills to excel in this role, we encourage you to apply and join our team.
Posted 3 days ago