Jobs
Interviews

52 ETL Workflows Jobs

Set up a Job Alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

5.0 - 10.0 years

5 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

As a Specialist in Data Visualization, you will focus on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities:
- Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
- Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
- Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
- Ensure adherence to data governance, privacy, and security best practices.
- Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
- Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
- Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
- Ensure timely delivery of high-quality outputs while creating and maintaining SOPs, KPI libraries, and other essential governance documents.

Required Experience and Skills:
- 5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
- Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot.
- Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
- Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets, to drive strategic insights.
- Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics.
- Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
- Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
- Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.

Posted 12 hours ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

Role Overview: As an ETL Lead at NTT DATA, you will be responsible for designing, developing, and optimizing ETL workflows using Matillion for cloud-based data platforms. Your role will involve working on enterprise data warehouses and data marts, utilizing SQL, Snowflake, and the Matillion ETL tool. You will also be expected to have a strong understanding of data warehousing concepts and to mentor junior engineers and analysts.

Key Responsibilities:
- Design, develop, and optimize ETL workflows using Matillion for cloud-based data platforms
- Work on enterprise data warehouses and data marts using SQL, Snowflake, and Matillion
- Develop features for an enterprise-level data warehouse, applying sound knowledge of data warehousing concepts
- Bring experience from a minimum of 2 full implementation cycles, from analysis through deployment and support
- Demonstrate a sound understanding of data warehouse concepts such as Slowly Changing Dimensions, Facts, and SCD Type 1 and Type 2 implementations (see the sketch below)
- Apply strong knowledge of Tableau, Cognos, and Qlik
- Command data integration, data virtualization, and data warehousing
- Mentor junior engineers and analysts to foster a culture of innovation and excellence
- Adapt communication style to technical and non-technical audiences
- Self-manage workload and priorities effectively
- Collaborate with cross-functional teams to deliver data-driven solutions
- Good to have: advanced data engineering using SQL, Python, and PySpark for data transformation and analysis

Qualifications Required:
- 8+ years of experience with enterprise data warehouses and data marts, SQL, and the Matillion ETL tool
- Strong work experience in SQL, Snowflake, and Matillion
- Excellent verbal and written communication skills with high attention to detail
- Self-motivated, driven individual comfortable working in a fast-paced environment

(Note: No additional details about the company were present in the job description.)
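
Since the posting calls out SCD Type 1 and Type 2 specifically, here is a minimal, illustrative Python sketch of the SCD Type 2 expire-and-insert pattern. It is a toy model only: the key, tracked columns, and the is_current/start_date/end_date fields are assumptions for illustration, and a Matillion implementation would express the same logic as a transformation job rather than hand-written code.

    from datetime import date

    def apply_scd2(dimension, incoming, key, tracked_cols, today=None):
        """Expire changed rows and insert new versions (SCD Type 2)."""
        today = today or date.today()
        current = {r[key]: r for r in dimension if r["is_current"]}
        for row in incoming:
            existing = current.get(row[key])
            if existing and any(existing[c] != row[c] for c in tracked_cols):
                existing["is_current"] = False       # expire the old version
                existing["end_date"] = today
            if existing is None or not existing["is_current"]:
                dimension.append({**row, "start_date": today,
                                  "end_date": None, "is_current": True})
        return dimension

    # Example: a customer moves city, so the old row is expired and a new
    # current row is appended, preserving full history.
    dim = [{"customer_id": 1, "city": "Chennai", "start_date": date(2024, 1, 1),
            "end_date": None, "is_current": True}]
    apply_scd2(dim, [{"customer_id": 1, "city": "Pune"}],
               key="customer_id", tracked_cols=["city"])

SCD Type 1 would instead overwrite the changed attribute in place, keeping no history.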

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Palantir professionals in the following areas:

We are seeking a highly motivated and technically skilled Senior Data Engineer to join our data team. The ideal candidate will have extensive experience designing, building, and optimizing scalable data pipelines and analytics solutions using Palantir Foundry, along with broader expertise in data architecture, governance, and processing. This role offers the opportunity to work on cutting-edge data platforms and deliver impactful solutions for enterprise data integration, transformation, and advanced analytics.

Key Responsibilities:
- Design, implement, and maintain scalable and reliable data pipelines using Palantir Foundry (see the sketch below).
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Build reusable data assets, integrate data from multiple sources, and ensure data quality, integrity, and security.
- Develop and maintain transformation workflows, ontology models, and data governance frameworks within Foundry.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Troubleshoot, monitor, and resolve data issues and pipeline failures in real time.
- Implement best practices in data versioning, lineage, and observability.
- Assist in defining data architecture strategies and tooling roadmaps.
- Mentor junior engineers and contribute to team knowledge sharing and documentation.

Required Qualifications:
- 5+ years of experience in data engineering, analytics, or software engineering roles.
- Strong hands-on experience with Palantir Foundry, including ontology modeling, pipelines, data fusion, and transformation logic.
- Proficiency in Python, SQL, and data processing frameworks.
- Solid understanding of data integration, ETL workflows, and data modeling principles.
- Experience working with relational databases, data lakes, and cloud data warehouses (AWS, Azure, GCP).
- Strong problem-solving skills and ability to debug complex data workflows.
- Familiarity with CI/CD pipelines and version control tools like Git.
- Excellent communication skills and ability to work collaboratively with cross-functional teams.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
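
As a rough illustration of the Foundry pipeline work described above, here is a minimal Python transform in the style of Foundry's transforms API. The dataset paths and column names are hypothetical assumptions, and the exact decorator signature should be verified against the Foundry documentation for your environment; this is a sketch, not a definitive implementation.

    from transforms.api import transform_df, Input, Output
    from pyspark.sql import functions as F

    # Hypothetical dataset paths; replace with real Foundry dataset paths/RIDs.
    @transform_df(
        Output("/Company/clean/orders"),
        raw=Input("/Company/raw/orders"),
    )
    def clean_orders(raw):
        # Deduplicate on the business key, normalize the date column,
        # and drop rows with no amount before downstream consumption.
        return (
            raw.dropDuplicates(["order_id"])
               .withColumn("order_date", F.to_date("order_date"))
               .filter(F.col("amount").isNotNull())
        )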

Posted 3 days ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Manager, Master Data Management. We are looking for a highly skilled SQL and Python expert to design, develop, and optimize data solutions that support analytics, reporting, and automation. The role will focus on managing complex data pipelines, ensuring data accuracy, and enabling business insights through scalable solutions.

Responsibilities:
- MDM management using the Profisee MDM tool; manage and maintain dimensions and master data
- Data visualization tools: expertise in Looker Studio
- Hierarchy management: understand the business and manage data hierarchies and mappings, particularly in the sales environment
- Optimize database performance, troubleshoot issues, and ensure data quality (a minimal data-quality sketch follows below)
- Collaborate with data analysts, BI developers, and business stakeholders to deliver accurate datasets for reporting and insights
- Upload the right data sets and check whether the output is in line with business expectations

Minimum Qualifications:
- Bachelor's degree or higher
- Hands-on experience in SQL development and database management
- Strong expertise in Python programming for data processing and automation
- Solid understanding of data modeling, ETL workflows, and query optimization
- Excellent communication and analytical skills

Preferred Qualifications/Skills:
- Experience with data visualization tools (Looker or Tableau)
- Familiarity with cloud platforms (Azure, AWS, or GCP) and data services
- Exposure to pandas, NumPy, and other Python data libraries
- Strong problem-solving skills, with the ability to work on large datasets and complex business logic
- Proven ability to lead cross-functional projects and mentor teams
- Knowledge of AI/ML concepts

Why join Genpact:
- Lead AI-first transformation - build and scale AI solutions that redefine industries
- Make an impact - drive change for global enterprises and solve business challenges that matter
- Accelerate your career - gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best - learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI - work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.
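
As a rough illustration of the data-quality checks mentioned in the responsibilities above, here is a minimal pandas sketch; the dataset, column names, and the specific checks are hypothetical assumptions for illustration only.

    import pandas as pd

    def run_quality_checks(df: pd.DataFrame) -> dict:
        """Run simple, illustrative data-quality checks on a master-data extract."""
        return {
            # Business keys must be unique for master data.
            "duplicate_keys": int(df["customer_id"].duplicated().sum()),
            # Required attributes should not be missing.
            "null_names": int(df["customer_name"].isna().sum()),
            # Hierarchy mappings should reference known parents.
            "orphan_rows": int((~df["parent_id"].isin(df["customer_id"])).sum()),
        }

    df = pd.DataFrame({
        "customer_id": [1, 2, 2],
        "customer_name": ["Acme", None, "Globex"],
        "parent_id": [1, 1, 9],
    })
    print(run_quality_checks(df))  # {'duplicate_keys': 1, 'null_names': 1, 'orphan_rows': 1}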

Posted 5 days ago

Apply

6.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Pune, Gurugram

Work from Office

HIRING - SR. DATA ENGINEER
Location: Hyderabad, Chennai, Pune, Bangalore, Gurgaon
Experience: 6-8 years
CTC: Up to 28 LPA
Notice Period: Immediate to 30 days
Skills: SQL, ETL, Data Transformation, SSIS, SSRS, Performance Tuning, Data Validation & Reconciliation
Drop your CV at rashibimaginators@gmail.com

Required Candidate Profile: Senior Data Engineer (MSBI / ETL) - SQL, ETL, Data Transformation, SSIS, SSRS, Performance Tuning, Data Validation & Reconciliation, ETL workflows, building reports and dashboards

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 9 Lacs

Hyderabad, Telangana, India

On-site

Tech Stalwart Solution Private Limited is looking for a Sr GCP Data Engineer to join our dynamic team and embark on a rewarding career journey. You will build, manage, and optimize data pipelines on Google Cloud Platform; develop ETL workflows, integrate cloud-based tools, and ensure data quality and scalability; and collaborate with analytics and engineering teams to deliver data-driven insights while maintaining compliance and data security standards.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a BI Project Manager to join our team in Hyderabad. As the Project Manager, you will be responsible for overseeing finance-related projects, ETL processes, and Power BI reporting. Your role will involve leading project lifecycles, collaborating with cross-functional teams, and translating business requirements into actionable insights.

Your key responsibilities will include managing end-to-end project lifecycles for financial data and reporting initiatives, defining project scope and deliverables in collaboration with various teams, ensuring the accuracy of financial data through ETL workflows, designing and developing Power BI dashboards and reports, tracking project progress, mitigating risks, and delivering projects within budget and timelines. Additionally, you will be required to translate business requirements into technical specifications, conduct stakeholder meetings, and ensure data compliance and security protocols are followed.

To qualify for this role, you should have a Bachelor's or Master's degree in Finance, Computer Science, Information Systems, or a related field, along with at least 5 years of project management experience, preferably in the financial or banking sectors. You should possess a strong understanding of ETL processes, data pipelines, and data warehousing concepts, as well as hands-on experience with Power BI tools such as DAX, Power Query, data modeling, and report publishing. Your track record should demonstrate successful project deliveries within deadlines, and you should have excellent communication, problem-solving, and stakeholder management skills. Experience with Agile or Scrum methodologies will be an added advantage.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a PySpark Developer with over 7 years of experience and expertise in Reltio MDM. As part of the Data Engineering team, your primary responsibility will be to design, build, and optimize scalable data pipelines, ensuring seamless integration with Reltio's cloud-native MDM platform. This is an immediate requirement, and we are looking for candidates who can join promptly.

Your key responsibilities will include designing, developing, and maintaining scalable data pipelines using PySpark in distributed computing environments like AWS EMR and Databricks (a minimal sketch follows below). You will be responsible for integrating and synchronizing data between enterprise systems and the Reltio MDM platform, implementing data transformation processes, and collaborating with various stakeholders on effective data modeling. Additionally, you will work on API-based integrations between Reltio and other applications, optimize PySpark jobs for performance and cost efficiency, and ensure data quality and governance across workflows.

To excel in this role, you should possess at least 7 years of hands-on experience in PySpark development and distributed data processing. Strong expertise in Apache Spark, DataFrames, and Spark SQL is essential, along with proven experience in Reltio MDM, REST APIs, and working with JSON data formats. Experience with cloud platforms, particularly AWS (S3, Lambda, Step Functions, EMR), data warehousing concepts, ETL workflows, and data modeling is required. Familiarity with CI/CD pipelines, Git, and version control is also beneficial, and strong problem-solving, analytical, and communication skills are key attributes for this role.

This opportunity offers the chance to work on cutting-edge data engineering projects and gain exposure to Reltio MDM, a leading cloud-native MDM platform. You will have the flexibility to work from any location across India (PAN India), making it a convenient and enriching work environment.
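
Here is a minimal, self-contained PySpark sketch of the kind of flatten-and-deduplicate transformation such a pipeline might perform on nested JSON entities. The entity shape and field names are hypothetical, and the Reltio-specific REST calls are deliberately omitted.

    from pyspark.sql import Row, SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("mdm-flatten-demo").getOrCreate()

    # Hypothetical nested records, standing in for JSON entities pulled
    # from an MDM platform's API.
    records = [
        Row(id="e1", attributes=Row(name="Acme", country="IN")),
        Row(id="e1", attributes=Row(name="Acme", country="IN")),  # duplicate
        Row(id="e2", attributes=Row(name="Globex", country="US")),
    ]
    df = spark.createDataFrame(records)

    # Flatten the nested struct into top-level columns and drop duplicates
    # on the business key before loading downstream.
    flat = (df.select("id",
                      F.col("attributes.name").alias("name"),
                      F.col("attributes.country").alias("country"))
              .dropDuplicates(["id"]))
    flat.show()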

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Design, develop, and maintain ETL workflows using Informatica.
- Work with business and technical teams to gather requirements and translate them into technical solutions.
- Optimize ETL performance, troubleshoot issues, and ensure high-quality data integration.
- Develop and maintain SQL queries, stored procedures, and scripts for data validation and transformation (a minimal validation sketch follows below).
- Perform unit testing, system integration testing, and support UAT.
- Provide production support, resolve incidents, and ensure timely delivery.
- Collaborate with cross-functional teams to deliver end-to-end ETL solutions.

Good to Have:
- Exposure to data warehousing concepts
- Knowledge of cloud-based ETL tools and platforms
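
As a hedged illustration of the data-validation scripts mentioned above, here is a small Python sketch that reconciles row counts and totals between a source and a target table after an ETL load. SQLite is used here only so the example is self-contained; the table and column names are hypothetical.

    import sqlite3

    # Hypothetical source and target tables, loaded into SQLite so the
    # sketch runs standalone; in practice these live in two databases.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src_orders (order_id INT, amount REAL);
        CREATE TABLE tgt_orders (order_id INT, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
    """)

    # Reconciliation: row counts and amount totals must match post-load.
    checks = {
        "row_count": "SELECT (SELECT COUNT(*) FROM src_orders) - (SELECT COUNT(*) FROM tgt_orders)",
        "amount_sum": "SELECT (SELECT SUM(amount) FROM src_orders) - (SELECT SUM(amount) FROM tgt_orders)",
    }
    for name, sql in checks.items():
        diff = con.execute(sql).fetchone()[0]
        print(f"{name}: {'OK' if diff == 0 else f'MISMATCH (diff={diff})'}")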

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As an AFC Transaction Monitoring - Senior Engineer, VP at our Pune location, you will be part of the Anti-Financial Crime (AFC) Technology team, working within a versatile agile squad. Your primary focus will be on designing, developing, and testing engineering solutions that enhance the Transaction Monitoring (TM) systems, enabling them to detect money laundering or terrorism financing activities efficiently.

Your role will involve managing and optimizing data flows within Transaction Monitoring, working on challenging problems with large, complex datasets, and leveraging cloud and Big Data technologies. You will be responsible for optimizing data pipelines, creating new ETL frameworks, and developing high-performance systems that process substantial data volumes using cutting-edge technologies. Collaborating with a cross-functional agile delivery team, you will bring innovation to software development by incorporating the latest technologies and practices, with an emphasis on business value. Your approach to engineering will be team-oriented, fostering open code, open discussion, and a supportive, collaborative environment. You will contribute to all software delivery stages, from initial analysis to production support.

As a Vice President, your leadership responsibilities will include creating efficient ETL workflows, implementing data validation and cleansing techniques, collaborating with developers and architects to design scalable solutions, and ensuring compliance with industry standards and best practices. Your key skills and experience should include strong analytical problem-solving capabilities, excellent communication skills, and experience with Google Cloud Platform, Oracle, Control-M, Linux, Agile methodology, and Hadoop. Proficiency in designing and delivering complex ETL pipelines in a regulatory space is essential.

At our organization, we offer a range of benefits such as a comprehensive leave policy, parental leave, childcare assistance, sponsorship for certifications, an Employee Assistance Program, hospitalization and life insurance, and health screening. We provide training, coaching, and a culture of continuous learning to support your career progression. Join us at Deutsche Bank Group to excel together every day, act responsibly, think commercially, take initiative, and work collaboratively in a positive, fair, and inclusive work environment. For further information, please visit our company website: https://www.db.com/company/company.htm

Posted 4 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a skilled and experienced Adobe Experience Platform (AEP) Developer to join our team. The ideal candidate should have prior experience with Adobe Experience Platform and other Adobe Marketing Cloud solutions like Adobe Target, Adobe Analytics, and Adobe Campaign, along with expertise in digital targeting and marketing automation technologies.

As an AEP Developer, your responsibilities will include analyzing business requirements and translating them into comprehensive AEP solutions. You will develop detailed business use cases demonstrating how AEP functionalities can address specific business needs and enhance customer experiences. Utilizing your expertise in Adobe Experience Platform, you will design, develop, and implement end-to-end business use cases that align with business objectives. Your role will also involve designing and implementing efficient ETL processes and integrations with other systems to ensure a continuous flow of data for marketing activities. Experience with Adobe Journey Optimizer for accurate tracking, targeting, and messaging will be essential. Additionally, you will monitor the performance of AEP solutions, identify and resolve issues, troubleshoot technical challenges, and optimize workflows for enhanced efficiency. It is crucial to stay updated with the latest advancements in Adobe Experience Platform and related technologies for continuous improvement.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, and at least 4 years of experience developing and implementing marketing automation solutions, with a focus on Adobe Marketing Cloud and Adobe Experience Platform services. Technical developer certifications across relevant products and expertise in core web technologies are required. You should also have experience in AEP data collection, ingestion, ETL workflows, data preparation development, data governance, data activation, performance optimization, data quality assurance, and AJO journey creation and measurement.

Join our dynamic team and contribute to enhancing our marketing operations by leveraging the power of Adobe Experience Platform to deliver personalized experiences to our customers. In addition to the job description, we offer a gender-neutral policy, 18 paid holidays throughout the year, generous parental leave, flexible work arrangements, and employee assistance programs to support your wellness and well-being.

Publicis Sapient is a digital transformation partner that helps organizations transition to a digitally enabled state. Our team combines expertise in technology, data sciences, consulting, and customer experience to accelerate our clients' businesses through innovative solutions. If you are passionate about digital transformation and customer-centric solutions, we invite you to join our team and be part of our mission to help people thrive in the pursuit of next.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an MDM (Master Data Management) SaaS and On-Prem Engineer, you will be responsible for designing, implementing, and supporting robust MDM systems that ensure seamless integration between on-premises and cloud platforms. Your expertise in MDM tools, data modeling, integration, and governance will be crucial in managing complex data ecosystems effectively.

Your key responsibilities will include:
- Designing, implementing, and supporting MDM solutions across SaaS and on-premise platforms to maintain a unified view of master data.
- Configuring and customizing MDM tools to meet business requirements and ensure compatibility across environments.
- Developing and maintaining master data models for various domains such as customers, suppliers, products, and financial data.

You will also be tasked with implementing and managing integrations between SaaS MDM platforms and on-premise systems, developing data pipelines, APIs, and ETL workflows, and collaborating with IT and business teams to ensure seamless data flow and system interoperability. Additionally, you will be responsible for establishing and enforcing data governance policies, implementing data quality rules and tools, monitoring the performance of MDM systems, and optimizing workflows and processes for data ingestion, cleansing, and integration. Your role will also involve collaborating with stakeholders, providing training and support to end users, ensuring compliance with data privacy regulations, maintaining comprehensive documentation of MDM processes, and identifying opportunities for process improvements and automation.

To qualify for this role, you should have at least 5 years of hands-on experience in MDM implementation and support, proficiency in MDM tools and technologies for both SaaS and on-premise environments, strong analytical and problem-solving skills, and excellent communication and collaboration abilities. Certifications in MDM tools, cloud platforms, and data governance or data quality would be advantageous.

Overall, this position requires a proactive individual with a deep understanding of MDM solutions and technologies, a commitment to data integrity and system performance, and the ability to communicate technical concepts effectively to non-technical stakeholders while collaborating with cross-functional teams to deliver successful MDM solutions.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As an experienced AI/ML Architect, you will play a crucial role in leading the design, development, and deployment of cutting-edge Generative AI solutions. Your deep understanding of AI/ML architectures, technical leadership skills, and expertise in cloud platforms like GCP will be instrumental in designing robust, scalable, production-ready systems. Collaborating closely with cross-functional teams and stakeholders, you will implement state-of-the-art AI solutions that address real-world challenges and drive business value.

Your responsibilities will include providing technical leadership by guiding development teams through best practices and architecture reviews. You will design end-to-end Generative AI solutions; develop data pipelines; and train, deploy, and monitor models in real time. Leveraging MLOps tools, you will automate workflows for scalable and repeatable deployments. Your role will also involve optimizing deployment pipelines using tools like Docker, Kubernetes, and cloud-native services, ensuring seamless integration into existing CI/CD pipelines. In addition, you will build scalable ETL workflows and implement data preprocessing and feature engineering pipelines to prepare data for Generative AI applications. Identifying and implementing optimization strategies to enhance model performance will be a key aspect of your work. You will also drive the development of Proof of Concepts (POCs) and pilot projects, collaborating with delivery and product teams to scale successful pilots to production-grade solutions.

To excel in this role, you should have at least 10 years of experience in AI/ML architecture, model development, and production deployment. Your expertise in designing, implementing, and scaling Generative AI and LLM-based solutions, along with hands-on experience with frameworks like LangChain and retrieval-augmented generation (RAG), will be invaluable (a minimal retrieval sketch follows below). Proficiency in Python, SQL, MLOps tools, containerization, and CI/CD pipelines is essential. Your strong technical leadership, communication skills, and strategic thinking abilities will be crucial in driving the adoption of Generative AI solutions and enabling continuous improvement.

Joining our team will give you the opportunity to work on cutting-edge AI solutions, collaborate with a technology-driven team, and drive innovation through technical leadership and mentorship. You will have access to the latest AI/ML technologies and frameworks, ensuring continuous professional growth in this dynamic field.
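
Since the role centers on RAG-style solutions, here is a minimal, framework-free Python sketch of the retrieval step at the heart of RAG. The embedding function is a toy stand-in (a real system would use a model-based embedder, for example via LangChain or a cloud API), and all names and documents are illustrative assumptions.

    import numpy as np

    def embed(text):
        """Toy stand-in for a real embedding model: character-frequency vector."""
        vec = np.zeros(26)
        for ch in text.lower():
            if "a" <= ch <= "z":
                vec[ord(ch) - ord("a")] += 1
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    documents = [
        "ETL pipelines move data from source systems into the warehouse.",
        "RAG augments a language model with retrieved context documents.",
        "Dashboards visualize KPIs for business stakeholders.",
    ]
    doc_vecs = np.stack([embed(d) for d in documents])

    def retrieve(query, k=1):
        """Return the k documents most similar to the query (cosine similarity)."""
        scores = doc_vecs @ embed(query)
        return [documents[i] for i in np.argsort(scores)[::-1][:k]]

    # The retrieved text would then be stitched into the LLM prompt as context.
    print(retrieve("retrieval augmented generation context"))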

Posted 1 month ago

Apply

9.0 - 13.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Architect specializing in AWS Big Data solutions, you will play a crucial role in the design and optimization of complex data architectures. Your primary responsibility will be to leverage AWS services to create efficient and cost-effective Big Data solutions while maintaining a balance between cost and performance. This includes evaluating existing technologies, conducting impact assessments, optimizing designs, and documenting projects. Effective communication of architectural decisions, conducting Proof-of-Concepts, proposing automation solutions, and providing expert advice are integral parts of this role.

Key Result Areas and Activities:
- **Technology Assessment and Impact Analysis:** Evaluate current technology and data integration frameworks, and perform impact assessments based on project requirements.
- **Big Data Use Case Design:** Develop intricate Big Data use cases using AWS services, with a focus on cost-effectiveness, performance, and durability.
- **Optimization and Performance:** Recommend optimizations to enhance the balance between cost and performance in existing designs.
- **Architectural Communication and Proof-of-Concepts:** Communicate architectural decisions with stakeholders, conduct Proof-of-Concepts, document outcomes, and lead design reviews.
- **Process Improvement and Automation:** Implement automation strategies to streamline processes, boost team productivity, and offer expert advice, troubleshooting, and support for client proposals.

**Essential Skills:**
- Proficiency in AWS services such as S3, EC2, EMR, Athena, AWS Glue, and Lambda, as well as AWS VPCs, subnets, security groups, and route tables.
- Expertise in MPP databases like AWS Redshift, Snowflake, and SingleStore, and a solid grasp of Data Warehousing concepts.
- Strong knowledge of Big Data technologies, particularly Apache Spark, and experience with table formats like Delta Lake or Apache Iceberg.
- Excellent programming skills in Scala and Python, proficiency in SQL, and experience with orchestration tools like Apache Airflow.
- Skilled in developing ETL workflows with complex transformations such as SCD, deduplication, and aggregation (a minimal sketch follows below).
- Familiarity with operating systems, including Linux and Windows.

**Desirable Skills:**
- Previous experience in a large media company would be beneficial. Knowledge of Data Fabric and Data Mesh concepts.
- Proficiency in the Java and Bash programming languages, and cloud databases like AWS Aurora.

**Qualifications:**
- Bachelor's degree in computer science, engineering, or a related field (Master's degree preferred).
- Continuous learning demonstrated through technical certifications or related methods.
- 9 years of IT experience with a minimum of 5 years dedicated to cloud-related projects.

**Qualities:**
- A strong technical background with the ability to conduct in-depth research across various technical domains.
- Self-motivated and dedicated to delivering results for a rapidly growing team and organization.
- Effective communication through written, verbal, and client presentations.
- Efficient and well-organized in cross-functional teams and when working across different time zones.

This position is based in India and requires 9 to 12 years of experience in the field.
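
As a rough sketch of the "deduplication and aggregation" transformations named above, here is a minimal PySpark example that keeps the latest record per key and then aggregates; the schema and values are hypothetical.

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dedup-agg-demo").getOrCreate()

    events = spark.createDataFrame(
        [("c1", "2024-01-01", 100.0), ("c1", "2024-02-01", 120.0),
         ("c2", "2024-01-15", 80.0)],
        ["customer_id", "updated_at", "balance"],
    )

    # Deduplication: keep only the latest record per customer.
    w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
    latest = (events.withColumn("rn", F.row_number().over(w))
                    .filter("rn = 1").drop("rn"))

    # Aggregation: total balance across the current records.
    latest.agg(F.sum("balance").alias("total_balance")).show()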

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed, and we believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Benefits:
- Comprehensive health and life insurance and well-being benefits, based on location.
- Pension/retirement benefits.
- Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
- A flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact You Will Have In This Role: Enterprise Services comprises multiple business platforms, including Client Services, Global Business Operations, Business Architecture, Data Strategy and Analytics, and Digital Services, which report into the Chief of Enterprise Services. These grouped platforms enable the business to optimize delivery for clients, generate efficiencies and resilience, and enable consistency in the business digitization strategy, processes, and end-to-end best practices. The skilled Automation Tester is experienced in testing applications developed in Appian, able to validate ETL workflows by querying and comparing result sets, and has hands-on knowledge of testing applications developed using RPA tools like BluePrism. The Automation Tester is a self-starter with a strong ability to prioritize, own testing deliverables/timelines, understand various solution components, and clearly and effectively communicate results with the team.

What You'll Do:
- Develop and execute test cases for applications developed in Appian, ensuring comprehensive coverage of both positive and negative scenarios.
- Test workflows designed on Talend, focusing on data extraction, transformation, and loading processes.
- Validate and verify automation (RPA) solutions developed using BluePrism, ensuring they meet business requirements and function as expected.
- Gather and set up required test data for testing, ensuring data integrity and consistency.
- Track test results and defects throughout the testing lifecycle, using tools like JIRA for defect management.
- Coordinate with the user base for a successful roll-out during the user acceptance test phase, providing clear and concise feedback.
- Independently manage multiple projects based on provided priorities to complete testing and provide feedback within given timelines.
- Collaborate with other team members and analysts through the delivery cycle, ensuring seamless integration and communication.
- Participate in an Agile delivery team that builds high-quality and scalable work products, contributing to sprint planning, reviews, and retrospectives.
- Assist in the evaluation of upcoming technologies and contribute to the overall solution design, providing insights and recommendations.
- Support production releases and maintenance windows, working closely with the Operations team to ensure smooth deployments.
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.

Qualifications:
- Bachelor's degree preferred or equivalent experience.

Talents Needed For Success:
- Minimum of 6 years of related experience in testing automation solutions.
- Ability to create scripts using Python (a minimal sketch follows below).
- Hands-on experience with test automation tools like Selenium, TestComplete, and UFT One.
- Experience using tools like BluePrism, UiPath, and Power Automate.
- Strong understanding of the SDLC and legacy technologies like MS Access and mainframe systems.
- Ability to write and execute SQL queries to validate test results in SQL Server databases.
- Experience in testing solutions built on Appian, with a focus on process automation and workflow management.
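
As an illustration of the Python scripting and SQL-validation skills listed above, here is a small pytest-style sketch that validates ETL results in a SQL Server database via pyodbc. The connection string, schemas, and table names are hypothetical placeholders, not taken from the posting.

    import pyodbc
    import pytest

    # Hypothetical SQL Server connection; server, database, and auth are placeholders.
    CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
                "SERVER=myserver;DATABASE=etl_qa;Trusted_Connection=yes")

    @pytest.fixture(scope="module")
    def cursor():
        conn = pyodbc.connect(CONN_STR)
        yield conn.cursor()
        conn.close()

    def test_no_orphan_workflow_rows(cursor):
        # Every loaded row must reference a known workflow definition.
        cursor.execute("""
            SELECT COUNT(*) FROM stg.workflow_runs r
            LEFT JOIN ref.workflows w ON w.workflow_id = r.workflow_id
            WHERE w.workflow_id IS NULL
        """)
        assert cursor.fetchone()[0] == 0

    def test_row_counts_match(cursor):
        # Staging and target row counts must agree after the load.
        cursor.execute("SELECT (SELECT COUNT(*) FROM stg.workflow_runs)"
                       " - (SELECT COUNT(*) FROM dbo.workflow_runs)")
        assert cursor.fetchone()[0] == 0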

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Senior Data Tester at Barclays and be part of the evolution of our digital landscape, driving innovation and excellence. You will utilize cutting-edge technology to transform our digital offerings, ensuring exceptional customer experiences. As a valued team member, you will deliver technology solutions, leveraging strong analytical and problem-solving skills to understand business requirements and provide high-quality solutions. Collaborating with engineers, business analysts, and stakeholders, you will tackle complex technical challenges that demand detailed analytical skills and analysis.

As a Senior Data Tester, your key responsibilities will include:
- Leading end-to-end testing of complex ETL workflows, with a specific focus on Ab Initio.
- Validating data transformations, integrations, and migrations within Data Warehousing environments.
- Designing and executing test cases, test plans, and strategies for Ab Initio and AWS-based data solutions, ensuring adherence to cloud best practices.
- Writing and optimizing complex SQL queries for data validation and reconciliation (a minimal reconciliation sketch follows below).
- Performing root cause analysis and troubleshooting issues across Unix-based systems and cloud platforms.
- Collaborating with developers, analysts, and business stakeholders to ensure comprehensive test coverage and traceability.

Additionally, highly valued skills for this role include:
- A graduate degree.
- Excellent communication and analytical skills.
- Ability to communicate effectively across various levels and capabilities.
- A collaborative mindset, willing to share best practices at all levels.

In this role, based in Pune, your primary purpose will be to design, develop, and implement testing strategies to validate functionality, performance, and user experience. You will work closely with cross-functional teams to identify and resolve defects, continuously improve testing processes, and ensure software quality and reliability.

Your accountabilities will include:
- Developing and implementing comprehensive test plans and strategies to validate software functionality and ensure compliance with quality standards.
- Creating and executing automated test scripts using testing frameworks and tools for early defect detection.
- Collaborating with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria.
- Conducting root cause analysis for identified defects and supporting defect resolution with developers.
- Participating in code reviews, promoting a culture of code quality and knowledge sharing.
- Staying updated on industry technology trends, contributing to technical communities, and fostering a culture of technical excellence and growth.

As a Senior Data Tester, you are expected to drive continuous improvement, demonstrate technical expertise, lead and support team members, and contribute to the organization's objectives. You will be responsible for managing risks, strengthening controls, and ensuring compliance with relevant rules and regulations. All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behaviors and actions.
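
As a sketch of the validation-and-reconciliation responsibility above, here is a minimal, self-contained example that uses SQL's EXCEPT to find source rows that are missing or changed in a migrated target. SQLite is used only to make the example runnable standalone; the table and column names are illustrative.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src (id INT, status TEXT);
        CREATE TABLE tgt (id INT, status TEXT);
        INSERT INTO src VALUES (1, 'open'), (2, 'closed'), (3, 'open');
        INSERT INTO tgt VALUES (1, 'open'), (2, 'open');
    """)

    # Rows present in the source but missing or changed in the target.
    missing = con.execute(
        "SELECT id, status FROM src EXCEPT SELECT id, status FROM tgt"
    ).fetchall()
    print("unreconciled rows:", missing)  # e.g. [(2, 'closed'), (3, 'open')]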

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Pune, Maharashtra, India

On-site

As a Specialist in Data Visualization, you will focus on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities:
- Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
- Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
- Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
- Ensure adherence to data governance, privacy, and security best practices.
- Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
- Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
- Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
- Ensure timely delivery of high-quality outputs while creating and maintaining SOPs, KPI libraries, and other essential governance documents.

Required Experience and Skills:
- 5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
- Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot.
- Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
- Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets, to drive strategic insights.
- Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics.
- Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
- Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
- Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.

Posted 1 month ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Pune, Maharashtra, India

On-site

As an Associate Specialist in Data Visualization, you will develop compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities:
- Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
- Collaborate with Data Viz leads and stakeholders to define key metrics, KPIs, and reporting needs.
- Participate and contribute in workshops to develop user stories, wireframes, and interactive visualizations.
- Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents.
- Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
- Ensure adherence to data governance, privacy, and security best practices.
- Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
- Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

Required Experience and Skills:
- 2+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
- Hands-on expertise in BI and visualization tools such as Power BI, PowerApps, Qlik Sense, MicroStrategy, and ThoughtSpot.
- Experience with data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
- Knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets to drive strategic insights, is a plus.
- Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics, is a plus.
- Experience in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.
- Good problem-solving and communication skills, with the ability to interpret data and provide actionable insights while working effectively in cross-functional environments.
- Basic understanding of product management principles, focusing on developing user-centric, scalable analytical solutions aligned with business needs.
- Familiarity with agile ways of working, including exposure to Agile/Scrum methodologies and participation in iterative development and continuous improvement initiatives in data visualization and analytics projects.
- Strong learning agility, with the ability to quickly adapt to new tools, technologies, and business environments while continuously enhancing analytical and technical skills.

Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design

Posted 1 month ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Hyderabad, Telangana, India

On-site

As an Associate Specialist in Data Visualization, you will develop compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities:
- Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
- Collaborate with Data Viz leads and stakeholders to define key metrics, KPIs, and reporting needs.
- Participate and contribute in workshops to develop user stories, wireframes, and interactive visualizations.
- Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents.
- Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
- Ensure adherence to data governance, privacy, and security best practices.
- Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
- Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

Required Experience and Skills:
- 2+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
- Hands-on expertise in BI and visualization tools such as Power BI, PowerApps, Qlik Sense, MicroStrategy, and ThoughtSpot.
- Experience with data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
- Knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets to drive strategic insights, is a plus.
- Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics, is a plus.
- Experience in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.
- Good problem-solving and communication skills, with the ability to interpret data and provide actionable insights while working effectively in cross-functional environments.
- Basic understanding of product management principles, focusing on developing user-centric, scalable analytical solutions aligned with business needs.
- Familiarity with agile ways of working, including exposure to Agile/Scrum methodologies and participation in iterative development and continuous improvement initiatives in data visualization and analytics projects.
- Strong learning agility, with the ability to quickly adapt to new tools, technologies, and business environments while continuously enhancing analytical and technical skills.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Delhi

On-site

The ideal candidate should possess extensive expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms like Databricks or Snowflake. You will be responsible for designing scalable data models, managing reliable data workflows, and ensuring the integrity and performance of critical financial datasets. Collaboration with engineering, analytics, product, and compliance teams is a key aspect of this role.

Responsibilities:
- Design, implement, and maintain logical and physical data models for transactional, analytical, and reporting systems.
- Develop and oversee scalable ETL/ELT pipelines to process large volumes of financial transaction data.
- Optimize SQL queries, stored procedures, and data transformations for enhanced performance.
- Create and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi (a minimal Airflow sketch follows below).
- Architect data lakes and warehouses utilizing platforms such as Databricks, Snowflake, BigQuery, or Redshift.
- Ensure adherence to data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
- Work closely with data engineers, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Conduct data profiling, validation, and quality assurance to maintain clean and consistent data.
- Maintain comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications:
- Proficiency in advanced SQL, including query tuning, indexing, and performance optimization.
- Experience developing ETL/ELT workflows with tools like Spark, dbt, Talend, or Informatica.
- Familiarity with data orchestration frameworks such as Airflow, Dagster, or Luigi.
- Hands-on experience with cloud-based data platforms like Databricks, Snowflake, or similar technologies.
- Deep understanding of data warehousing principles (star/snowflake schema, slowly changing dimensions, etc.).
- Knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
- Strong analytical and problem-solving skills in high-scale environments.

Preferred Qualifications:
- Exposure to real-time data pipelines (Kafka, Spark Streaming).
- Knowledge of data mesh or data fabric architecture paradigms.
- Certifications in Snowflake, Databricks, or relevant cloud platforms.
- Familiarity with Python or Scala for data engineering tasks.
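
Given the orchestration tools named above, here is a minimal Airflow DAG sketch wiring an extract step ahead of a load step. Task logic, IDs, and the schedule are hypothetical; the schedule argument follows Airflow 2.4+ style (older releases use schedule_interval).

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling source data...")          # placeholder for real extract logic

    def load():
        print("loading into the warehouse...")   # placeholder for real load logic

    with DAG(
        dag_id="demo_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task   # load runs only after extract succeeds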

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Senior Database Developer at Kobie's India Tech Hub, you will have the opportunity to be one of the first hires in an exciting new venture. Kobie, a leading player in the loyalty industry, is expanding its global footprint by establishing a Tech Hub in India. This initiative aims to create deep connections with customers through personalized, data-driven loyalty experiences, further enhancing enterprise value through loyalty.

Your role will involve designing scalable, data-driven solutions for high-impact loyalty platforms, leveraging your expertise in PL/pgSQL, efficient SQL queries, performance tuning, and ETL workflows. You will work with Oracle and/or PostgreSQL databases, handling complex data structures to support data integration, transformation, and analytics. As a key member of the team, you will be responsible for developing and maintaining database solutions that facilitate client onboarding, reward processing, data quality assurance, and operational performance. Collaboration with cross-functional teams such as developers, QA specialists, analysts, and DBAs will be essential to optimize data pipelines and queries, ensuring they meet the evolving needs of clients and marketing platforms.

Your impact will be significant as you contribute to import and extract processes and data migration efforts, troubleshoot data quality and performance issues, tune queries for optimal performance, support data integration from various sources, and provide technical assistance to stakeholders. Your ability to work both independently and collaboratively, manage priorities effectively, and communicate with technical and non-technical team members will be crucial for success.

To excel in this role, you should have 5-7+ years of experience in SQL query design and maintenance, proficiency in Oracle and/or PostgreSQL, expertise in performance tuning, ETL development, and data security, and a track record of working in team environments. Bonus skills include experience with data mapping tools, modern cloud platforms like Snowflake, job scheduling automation, version control systems, and supporting Java-based development teams.

At Kobie, known for its award-winning culture and innovative loyalty solutions, you will be part of a collaborative, growth-focused environment. As a trusted partner to global brands, Kobie focuses on building lasting emotional connections with consumers through strategy-led technology solutions. The launch of the India Tech Hub presents an exciting opportunity to be part of a culture that values diversity, equity, inclusion, and giving back to the community. Joining Kobie means access to competitive benefits, comprehensive health coverage, well-being perks, flexible time off, and opportunities for career growth. The integration of new teammates in India with U.S. teams, exposure to global projects, and the future establishment of a physical office in Bengaluru emphasize Kobie's commitment to collaboration and connection. This is your chance to be part of something significant and shape the future of the Kobie India Tech Hub. Apply now and contribute to delivering innovative customer experiences for renowned brands while working alongside industry leaders in the loyalty space.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Senior JEDOX Developer at Siemens Energy, your primary responsibility will be to work closely with global business users to address tickets submitted via SharePoint or the mailbox. You will collaborate with IT development and middleware teams to identify and implement solutions aligned with agreed operation and service level agreements. Additionally, you will play a key role in the monthly closing process, ensuring data accuracy and coordinating with end users. Attending sprint development meetings and engaging with collaborators and senior management will be essential to your role, helping you expand your network and prepare for future global responsibilities within Siemens Energy.

Your impact will be significant as you lead the design, development, and implementation of data pipelines and ETL workflows. You will be tasked with managing and optimizing workflows for efficient data processing, designing data solutions in databases, and proactively developing reports with minimal documented requirements. Collaborating with cross-functional teams to translate requirements into scalable data architecture and fostering continuous improvement and innovation will be key aspects of your role.

To excel in this position, you should have at least 6 years of experience in IT, preferably with a background in Engineering or a related field. Your expertise should include 4+ years of experience with ETL workflows, data analytics, reporting tools like Power BI and Tableau, and cloud databases such as Snowflake. Familiarity with EPM tools like JEDOX, Anaplan, or TM1, multidimensional database concepts, Power Automate workflows, and Excel formulas will be advantageous. The ability to adapt to new technologies and thrive in a fast-paced environment, collaborate effectively with business users, and stay informed about industry trends is essential for this role.

Joining the Value Center Manufacturing team at Siemens Energy means being part of a dynamic group focused on driving digital transformation in manufacturing. You will contribute to innovative projects that impact the business and industry, playing a vital role in achieving Siemens Energy's objectives. The Digital Core team supports Business Areas by delivering top-notch IT, Strategy & Technology solutions.

Siemens Energy is a global energy technology company with a diverse workforce committed to sustainable and reliable energy solutions. Our emphasis on diversity fuels our creativity and innovation, allowing us to harness the power of inclusion across over 130 nationalities. At Siemens Energy, we prioritize decarbonization, new technologies, and energy transformation to drive positive change in the energy sector.

As a Siemens Energy employee, you will enjoy benefits such as medical insurance coverage for yourself and eligible family members, including a family floater cover. Additionally, you will have the option to opt for a meal card as part of your CTC, providing tax-saving benefits as per company policy. Siemens Energy is dedicated to creating a supportive and inclusive work environment where individuals from all backgrounds can thrive and contribute to our shared success. Join us in shaping the future of energy and making a meaningful impact on society.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Engineer, you will be responsible for designing, developing, and delivering ADF pipelines for the Accounting & Reporting Stream. Your role will involve creating and maintaining scalable data pipelines using PySpark and ETL workflows in Azure Databricks and Azure Data Factory. You will also work on data modeling and architecture to optimize data structures for analytics and business requirements.

Your responsibilities will include monitoring, tuning, and troubleshooting pipeline performance for efficiency and reliability. Collaboration with business analysts and stakeholders is key to understanding data needs and delivering actionable insights. Implementing data governance practices to ensure data quality, security, and compliance with regulations is essential. You will also be required to develop and maintain documentation for data pipelines and architecture. Experience in testing and test automation is necessary for this role.

Collaboration with cross-functional teams to understand data requirements and provide technical advice is crucial. A strong background in data engineering is required, with proficiency in SQL, Azure Databricks, Blob Storage, Azure Data Factory, and programming languages like Python or Scala, along with knowledge of Logic Apps and Key Vault. Strong problem-solving skills and the ability to communicate complex technical concepts to non-technical stakeholders are essential for effective communication within the team.
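
For context on the pipeline work described, here is a minimal PySpark sketch of the read-transform-write pattern typically run in Azure Databricks. The storage paths and column names are placeholders, not a specific client pipeline.

```python
# Illustrative PySpark sketch: a simple accounting-feed transform.
# Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("accounting-feed-demo").getOrCreate()

raw = spark.read.parquet("abfss://raw@examplestore.dfs.core.windows.net/ledger/")

cleaned = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("posting_month", F.date_trunc("month", F.col("posting_date")))
       .groupBy("account_code", "posting_month")
       .agg(F.sum("amount").alias("monthly_total"))
)

# Partitioning by month keeps downstream reads for a single close period cheap.
cleaned.write.mode("overwrite").partitionBy("posting_month").parquet(
    "abfss://curated@examplestore.dfs.core.windows.net/ledger_monthly/"
)
```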

Posted 1 month ago

Apply

4.0 - 10.0 years

0 Lacs

delhi

On-site

As an Ab Initio Developer, your primary responsibility will be to develop and optimize ETL workflows and processes for data extraction, transformation, and loading. You should have a minimum of 4 years of hands-on experience in Ab Initio development and proficiency in the different components of the Ab Initio suite.

Your role will involve designing and implementing ETL processes for large-scale data sets, drawing on a strong understanding of ETL concepts, data warehousing principles, and relational databases. To excel in this role, you must possess solid knowledge of SQL and scripting languages for data manipulation and analysis. Your problem-solving skills will be crucial as you work independently or collaboratively within a team. Effective communication is key, as you will be interacting with stakeholders at various levels to ensure the successful execution of projects.

Additionally, you will be responsible for performing unit testing, debugging, and troubleshooting of Ab Initio graphs and applications. This is essential to guaranteeing the accuracy and integrity of the data throughout the development process. If you are someone who thrives in a dynamic environment, has a passion for data transformation, and enjoys tackling complex challenges, this role offers you the opportunity to leverage your expertise and contribute significantly to the success of projects.
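
Ab Initio graphs themselves are built in the graphical development environment rather than in code, but the surrounding validation scripting the posting mentions often resembles the minimal, hypothetical sketch below; the database file and table names are invented stand-ins for whatever warehouse is actually in use.

```python
# Illustrative sketch: reconcile row counts and a column checksum between a
# source extract and the loaded target. All names are hypothetical; a real
# check would run against the production databases, not SQLite.
import sqlite3

def table_stats(conn, table):
    """Return (row_count, total_amount) for a table with an 'amount' column."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

conn = sqlite3.connect("warehouse_demo.db")   # stand-in for the real databases
src_rows, src_sum = table_stats(conn, "stg_transactions")
tgt_rows, tgt_sum = table_stats(conn, "dw_transactions")

assert src_rows == tgt_rows, f"row count drift: {src_rows} vs {tgt_rows}"
assert abs(src_sum - tgt_sum) < 0.01, f"amount drift: {src_sum} vs {tgt_sum}"
print("Load reconciles: counts and totals match.")
```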

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

kochi, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. EY is counting on your unique voice and perspective to help them become even better. Join EY to build an exceptional experience for yourself and contribute to creating a better working world for all.

The Capacity Analyst role involves developing and implementing capacity plans for complex IT services, applications, and infrastructure components to ensure efficient use of IT infrastructure to cost-effectively meet business needs. The position collaborates across the EY Technology organization to incorporate capacity management principles into the design and management of network/services/applications.

Key Responsibilities:
- Develop complex service capacity forecasts and scalability plans using production system metrics, performance test results, and business requirements.
- Ensure efficient utilization of IT infrastructure to meet capacity and performance requirements for services and applications.
- Work with service delivery organizations in EY Technology to develop business forecasts and incorporate them into service and component capacity plans.
- Define component resource requirements for complex services/applications and develop scalability plans.
- Monitor service capacity performance for complex applications/products/services and components to proactively avoid incidents/problems.
- Perform problem determination of capacity/performance issues and recommend solutions.
- Oversee the Capacity Management process for complex services and components.
- Monitor capacity process compliance, recommend and implement process improvements.

Skills and Attributes for Success:
- Comprehensive understanding of Generative AI technologies and frameworks.
- Extensive experience in data migration processes and Azure Data Factory.
- Knowledge of virtualization technologies, MS SQL, Network, Server Hardware, Storage, and Cloud technologies.
- Experience in Capacity Modelling and data analysis/reporting tools.
- Understanding of Capacity Management methodologies based on ITIL principles.
- Knowledge of global server architectures and planning for systems over multiple global locations.

Qualifications:
- Bachelor's degree in a related discipline or equivalent work experience.
- 2-5 years of experience in Data Science with a focus on Data Engineering and Generative AI.
- Strong problem-solving, communication, interpersonal, and analytic skills.
- Knowledge of the IT Process Landscape.
- Excellent English language skills.
- Ability to adapt to changing priorities and work with diverse teams.
- Certification in ITIL V3 Foundations or equivalent experience.

Desired Skills:
- Excellent communication and negotiation skills.
- Flexibility to adjust to changing demands and work with diverse teams.
- Strong teamwork, collaboration, documentation, and analytics skills.

What We Offer: EY provides a highly integrated, global team environment with opportunities for growth and career development. The benefits package focuses on physical, emotional, financial, and social well-being. Continuous learning, transformative leadership, and a diverse and inclusive culture are key aspects of working at EY. EY is committed to building a better working world by creating long-term value for clients, people, and society. With diverse teams in over 150 countries, EY provides trust through assurance and helps clients grow, transform, and operate across various services.

Please note that the role may require working outside normal hours and travel, both domestically and internationally, given its global focus and responsibilities.
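
To make the capacity-forecasting responsibility concrete, here is a minimal sketch of a linear capacity projection in Python; the utilization figures and the 80% threshold are invented for illustration and are not EY data or policy.

```python
# Illustrative sketch: fit a linear trend to monthly peak CPU utilization and
# project when it crosses a capacity threshold. Data points are invented.
import numpy as np

months = np.arange(12)                                   # last 12 months
peak_cpu = np.array([52, 54, 55, 58, 60, 61, 63, 66, 67, 70, 72, 74], float)

slope, intercept = np.polyfit(months, peak_cpu, 1)       # simple linear fit

THRESHOLD = 80.0                                         # assumed headroom policy
months_to_threshold = (THRESHOLD - intercept) / slope

print(f"Trend: {slope:.2f} pts/month; "
      f"threshold reached around month {months_to_threshold:.1f}")
# A real capacity plan would layer business forecasts and seasonality on top
# of a trend like this before recommending scaling or procurement changes.
```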

Posted 1 month ago

Apply

