5.0 - 8.0 years
7 - 10 Lacs
Chennai
On-site
Before applying for a job, select your preferred language from the options available at the top right of this page.

Discover your next opportunity within an organization that ranks among the world's 500 largest companies. Explore innovative opportunities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description

Job Summary:
We are seeking a skilled and proactive Site Reliability Engineer (SRE) with 5-8 years of experience and deep expertise in Google Cloud Platform (GCP). The ideal candidate will be responsible for the reliability, availability, and performance of cloud-based applications and infrastructure. You will collaborate with development, operations, and security teams to build and maintain scalable, secure, and highly available systems.

Key Responsibilities:
- Design, develop, and maintain reliable, scalable, and highly available systems on GCP.
- Build and manage CI/CD pipelines, infrastructure as code (IaC), and monitoring solutions.
- Proactively monitor and manage system performance, uptime, and capacity using observability tools.
- Troubleshoot and resolve infrastructure and application-level issues in real time.
- Implement and maintain disaster recovery, failover mechanisms, and backup strategies.
- Automate repetitive tasks and processes to improve efficiency and reduce toil.
- Participate in on-call rotations, incident management, and root cause analysis (RCA).
- Ensure compliance with security standards, privacy regulations, and governance policies.
- Collaborate with cross-functional teams to support DevOps and SRE best practices.
- Drive improvements in SLAs, SLOs, and error budgets through data-driven insights.

Required Qualifications:
- 5-8 years of relevant experience as an SRE, DevOps Engineer, or Cloud Infrastructure Engineer.
- Strong hands-on experience with Google Cloud Platform (GCP): Compute Engine, GKE, Cloud Functions, Cloud Storage, IAM, BigQuery, etc.
- Proficiency in Infrastructure as Code tools such as Terraform, Deployment Manager, or CloudFormation.
- Experience with Kubernetes, Docker, and container orchestration.
- Proficiency in scripting languages such as Python, Shell, or Go.
- Deep understanding of monitoring and logging tools such as Prometheus, Grafana, Stackdriver, or Datadog.
- Knowledge of CI/CD tools such as Jenkins, GitLab CI, or Cloud Build.
- Experience with incident response, postmortem analysis, and site reliability principles.
- Strong problem-solving and communication skills.

Preferred Qualifications:
- GCP certifications (e.g., Professional Cloud DevOps Engineer, Cloud Architect).
- Exposure to multi-cloud or hybrid cloud environments.
- Familiarity with Agile and ITIL frameworks.
- Experience working in regulated environments with compliance standards (e.g., ISO, SOC 2).

Contract type: permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
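The SLO and error-budget responsibility above has a simple arithmetic core. As an illustration only (not UPS code; the 99.9% target and request counts are invented), error-budget tracking can be sketched like this:

```python
# Hypothetical sketch of error-budget arithmetic for an SLO, as referenced
# in the responsibilities above. The SLO target and request counts are
# invented for illustration.

def error_budget_remaining(slo_target: float,
                           total_requests: int,
                           failed_requests: int) -> float:
    """Fraction of the error budget still unspent.

    With a 99.9% availability SLO, the budget is the 0.1% of requests
    allowed to fail; spending is the observed failure rate divided by
    that allowance.
    """
    budget = 1.0 - slo_target              # allowed failure rate, e.g. 0.001
    observed = failed_requests / total_requests
    return 1.0 - observed / budget         # 1.0 = untouched, <= 0 = exhausted

# 99.9% SLO over 1,000,000 requests with 400 failures: the observed
# failure rate is 0.0004 against a 0.001 budget, so about 60% remains.
print(error_budget_remaining(0.999, 1_000_000, 400))
```

In practice the counts would come from an observability stack such as Prometheus or Cloud Monitoring rather than being hard-coded.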
Posted 4 weeks ago
0 years
8 - 9 Lacs
Chennai
On-site
Job Description

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. It empowers businesses with enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About the role
We are seeking a Data Developer to join our data engineering team, which builds and maintains complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB.
The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering on Microsoft Azure.

Primary Skills
- Data Engineering: Azure Data Factory (ADF), Azure Databricks.
- Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
- Data Modeling: NoSQL data modeling, data warehousing concepts.
- Performance Optimization: data pipeline performance tuning and cost optimization.
- Programming Languages: Python, SQL, PySpark.

Secondary Skills
- DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
- Security and Compliance: implementing data security and governance standards.
- Agile Methodologies: experience in Agile/Scrum environments.

Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications in Azure and data engineering, such as: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

Contract type: permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
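The "high data quality" requirement above often reduces to simple gates run before data is loaded onward. A purely illustrative sketch (plain Python standing in for a Databricks/PySpark job; the column names are invented):

```python
# Minimal data-quality gate of the kind an ADF/Databricks pipeline step
# might run before loading rows into Cosmos DB. Plain Python is used for
# self-containment; a real job would express this in PySpark or SQL.
# The column names ("id", "sku") are hypothetical.

def null_rate(rows: list, column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def quality_gate(rows: list, required_columns: list,
                 max_null_rate: float = 0.01) -> list:
    """Return the columns whose null rate exceeds the allowed threshold."""
    return [c for c in required_columns if null_rate(rows, c) > max_null_rate]
```

A pipeline would fail fast (or quarantine the batch) whenever `quality_gate` returns a non-empty list.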
Posted 4 weeks ago
5.0 - 10.0 years
0 Lacs
Noida
On-site
For more than 60 years, SYSTRA has built expertise spanning the entire spectrum of mass rapid transit systems. SYSTRA's presence in India dates back to 1957, when SYSTRA worked on the electrification of Indian Railways. Our technical excellence, holistic approach, and tremendous talent offer a career that puts the people who join us at the heart of improving transportation and urban infrastructure efficiency. Learn more about who we are at www.systra.in

CONTEXT
Since the early 1990s, the SYSTRA group has established a strong presence in Bangladesh, contributing to the successful completion of a wide range of complex railway, metro, highway, and bridge projects for various government organizations and funding agencies. The multi-disciplinary services SYSTRA provides across these projects comprise feasibility studies, design and engineering, bid documentation, tendering services, environmental and social safeguards studies, construction supervision, and project management consultancy. The services also extend to business planning, policy reform, and institutional and organizational restructuring.

MISSIONS/MAIN DUTIES
- Support Business Development: collaborate closely with the Business Development team to design and refine professional draft materials for client presentations, tender proposals, and pitch documents.
- Develop Marketing Collateral: create visually compelling brochures, infographics, case studies, event assets, and other marketing materials that resonate with various audiences.
- Annual Reports & Corporate Communications: design and lay out company annual reports, executive briefs, and visually cohesive drafts for both internal and external communications.
- Presentations: assist with the development and visual enhancement of corporate presentations and stakeholder decks, ensuring clarity and professionalism.
- Visual Storytelling: translate complex infrastructure and business concepts into clear, accessible, and engaging graphics and illustrations that enhance our messaging.
- Brand Consistency: ensure all design outputs adhere to established brand guidelines, maintaining a consistent visual identity across all channels and materials.
- Template & Asset Management: maintain and evolve a library of design templates, graphical assets, and branded content for efficient and consistent use.
- Digital Content Support: contribute to digital campaigns, including social media visuals, newsletters, and website graphics as required.

PROFILE/SKILLS
Mandatory:
- Education: Bachelor's degree in Graphic Design, Visual Communication, or a related discipline.
- Experience: at least 5 years of hands-on graphic design experience, ideally supporting business development, marketing, or communications in the infrastructure, engineering, or B2B sectors.
- Technical Skills: mastery of Adobe Creative Suite (InDesign, Illustrator, Photoshop) for creating detailed reports, proposals, and marketing materials; strong command of presentation software (MS PowerPoint, Keynote), with the ability to turn basic slides into compelling visual stories; proficiency with the MS Office Suite (Word, Excel) for layout and integration of graphics in various documents.
- Portfolio: a diverse portfolio demonstrating expertise in designing corporate presentations, reports, and proposals; creating infographics, visual summaries, and marketing collateral; and translating complex information into clear, engaging visuals.
- Design Aptitude: deep understanding of layout, composition, visual hierarchy, color theory, typography, and branding, ensuring clarity and brand consistency across all deliverables.
- Project Management: proven ability to manage multiple projects simultaneously, meet deadlines, and prioritize requests from different teams.
- Detail-Oriented: exceptional attention to accuracy, completeness, and polish in all visual outputs.
- Communication & Collaboration: strong interpersonal skills, with the ability to clearly communicate design ideas and work effectively with cross-functional teams (business development, engineering, communications, leadership).

Preferred:
- Digital & Web: familiarity with digital content creation tools (Canva, Figma, web graphics) for social media and online campaigns.
- Proposal/Bid Experience: previous work with proposal, bid, or business development teams, especially in consultancy or technical fields.
- Content Management: experience developing and maintaining content libraries, templates, brand assets, and documentation for team efficiency.
- Interest in Industry Trends: awareness of sustainable design practices and innovation within the transportation, infrastructure, or engineering sectors.

We are committed to putting the people who join us at the heart of improving transportation and urban infrastructure efficiency. As we grow, now is the time to be part of this challenging adventure. It's not a job, it's a career!

JOB DETAILS
Country/Region: India
Location: Noida
Field: Business Development, Bids, Marketing & Communication
Contract type: permanent (CDI)
Experience level: 5-10 years
Posted 4 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are looking for a seasoned, strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure.

Your Key Responsibilities
- Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
- Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
- Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
- Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence.
- Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
- Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
- Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
- Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
- Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
Skills And Attributes For Success
- Expertise in AWS data services and the ability to lead architectural discussions.
- Analytical thinker with the ability to design and optimize end-to-end data workflows.
- Excellent debugging and incident-resolution skills in large-scale data environments.
- Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
- A growth mindset with a passion for building reliable, scalable data systems.
- Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.

To qualify for the role, you must have
- 5-8 years of experience in DataOps, Data Engineering, or related roles.
- Strong hands-on expertise in Databricks.
- Deep understanding of ETL pipelines and modern data integration patterns.
- Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
- Experience with Airflow or AWS Data Pipeline for orchestration and scheduling.
- Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
- SQL skills with an emphasis on performance tuning, complex joins, and window functions.

Technologies and Tools
Must haves
- Proficiency in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda.
- Expertise in Databricks: the ability to develop, optimize, and troubleshoot advanced notebooks.
- Strong experience with Amazon Redshift for scalable data warehousing and analytics.
- Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline.
- Hands-on experience with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms.

Good to have
- Exposure to Power BI or Tableau for data visualization.
- Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms.
- Understanding of DevOps and CI/CD automation tools for data engineering workflows.
- SQL familiarity across large datasets and distributed databases.

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: you'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: we'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: we'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: you'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
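The orchestration duty in this posting (Airflow or AWS Data Pipeline with SLA adherence) ultimately means running tasks in dependency order. As a toy stand-in using only the standard library, with invented task names, the idea looks like this; a real deployment would declare the same graph as an Airflow DAG:

```python
# Toy dependency-ordered task runner illustrating the orchestration idea;
# this is not Airflow itself. The task names ("extract", "transform",
# "load") are invented for the example.
from graphlib import TopologicalSorter

def run_pipeline(tasks: dict, deps: dict) -> list:
    """Run callables in dependency order.

    `deps` maps a task name to the set of tasks that must finish first.
    Returns the execution order actually used.
    """
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()          # raise here -> downstream tasks never run
    return order
```

`TopologicalSorter` raises `CycleError` on circular dependencies, which is exactly the validation an orchestrator performs when a DAG is loaded.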
Posted 4 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description:
- Monitor and analyze accounting data and produce financial reports and statements.
- Establish and enforce proper accounting methods, policies, and principles.
- Manage and oversee the daily operations of the accounting department, including: month-end and year-end processes; general ledger; treasury; budgeting; cash forecasting; analysis of revenue and expenditure variances; capital asset reconciliations; fixed asset activity.
- Coordinate and complete annual audits.
- Handle inter-company settlement and reconciliation, and GST compliance.
- Provide recommendations.
- Improve systems and procedures and initiate corrective actions.
- Assign projects and direct staff to ensure compliance and accuracy.
- Meet financial accounting objectives.
- Establish and maintain fiscal files and records to document transactions.
- Staff management (career development, staffing, performance management).

What You'll Get
- A competitive salary package and excellent benefits, in line with industry standards.
- An international work environment with opportunities to progress and grow, thanks to our 'promotion from within' policy.
What You'll Need
- CA degree.
- Experience in statutory audit, accounting compliance, GST, and transfer pricing.

About UPS
Founded in 1907 as a messenger company in the United States, UPS has grown into a multi-billion-dollar corporation by clearly focusing on the goal of enabling commerce around the globe. Today, UPS is a global company with one of the most recognized and admired brands in the world. We have become the world's largest package delivery company and a leading global provider of specialized transportation and logistics services. Every day, we manage the flow of goods, funds, and information in more than 200 countries and territories worldwide.

Excited about this challenge? Apply now at UPS! UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Contract type: permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
Posted 4 weeks ago
6.0 years
0 Lacs
Kanayannur, Kerala, India
Remote
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
- Data Pipeline Management: build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
- ETL Operations: design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premises sources.
- Data Lake Management: organize and manage structured and unstructured data in Azure Data Lake, following performance and security best practices.
- Data Quality & Validation: perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
- Monitoring & Troubleshooting: use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
- Reporting & Visualization: work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
- DevOps & CI/CD: support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
- Tool Integration: collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
- Collaboration & Documentation: partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
- Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks.
- Solid understanding of ETL/ELT design and implementation principles.
- Strong SQL and PySpark skills for data transformation and validation.
- Exposure to Python for automation and scripting.
- Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred).
- Experience working with Power BI or Tableau for data visualization and reporting support.
- Strong problem-solving skills, attention to detail, and commitment to data quality.
- Excellent communication and documentation skills to interface with technical and business teams.
- Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing.

To qualify for the role, you must have
- 4-6 years of experience in DataOps or Data Engineering roles.
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem.
- Experience working with Informatica CDI or similar data integration tools.
- Scripting and automation experience in Python/PySpark.
- Ability to support data pipelines in a rotational on-call or production support environment.
- Comfort working in a remote/hybrid and cross-functional team setup.

Technologies and Tools
Must haves
- Azure Databricks: experience in data transformation and processing using notebooks and Spark.
- Azure Data Lake: experience working with hierarchical data storage in Data Lake.
- Azure Synapse: familiarity with distributed data querying and data warehousing.
- Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
- ETL Process Understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.

Good to have
- Power BI or Tableau for reporting support.
- Monitoring/logging using Azure Monitor or Log Analytics.
- Azure DevOps and Git for CI/CD and version control.
- Python and/or PySpark for scripting and data handling.
- Informatica Cloud Data Integration (CDI) or similar ETL tools.
- Shell scripting or command-line data handling.
- SQL (across distributed and relational databases).

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: you'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: we'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: we'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: you'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
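The validation and SQL skills listed in this posting frequently meet in one recurring pattern: keeping only the latest record per business key with a window function. A hedged sketch, using the stdlib `sqlite3` module as a stand-in for Synapse or Databricks SQL (the `holdings` table and its columns are invented):

```python
# Window-function deduplication: keep the most recent row per key.
# sqlite3 stands in for a warehouse engine here; the "holdings" table
# and its columns are hypothetical examples.
import sqlite3

def latest_per_key(rows):
    """rows: (key, ts, value) tuples; returns the newest row per key."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE holdings (key TEXT, ts INTEGER, value REAL)")
    con.executemany("INSERT INTO holdings VALUES (?, ?, ?)", rows)
    cur = con.execute("""
        SELECT key, ts, value FROM (
            SELECT key, ts, value,
                   ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) AS rn
            FROM holdings)
        WHERE rn = 1
        ORDER BY key
    """)
    return cur.fetchall()
```

The same `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` shape carries over to most warehouse SQL dialects.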
Posted 4 weeks ago
8.0 years
0 Lacs
Kanayannur, Kerala, India
Remote
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are looking for a seasoned, strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure.

Your Key Responsibilities
- Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
- Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
- Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
- Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence.
- Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
- Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
- Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
- Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
- Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
Skills And Attributes For Success
- Expertise in AWS data services and the ability to lead architectural discussions.
- Analytical thinker with the ability to design and optimize end-to-end data workflows.
- Excellent debugging and incident-resolution skills in large-scale data environments.
- Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
- A growth mindset with a passion for building reliable, scalable data systems.
- Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.

To qualify for the role, you must have
- 5-8 years of experience in DataOps, Data Engineering, or related roles.
- Strong hands-on expertise in Databricks.
- Deep understanding of ETL pipelines and modern data integration patterns.
- Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
- Experience with Airflow or AWS Data Pipeline for orchestration and scheduling.
- Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
- SQL skills with an emphasis on performance tuning, complex joins, and window functions.

Technologies and Tools
Must haves
- Proficiency in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda.
- Expertise in Databricks: the ability to develop, optimize, and troubleshoot advanced notebooks.
- Strong experience with Amazon Redshift for scalable data warehousing and analytics.
- Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline.
- Hands-on experience with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms.

Good to have
- Exposure to Power BI or Tableau for data visualization.
- Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms.
- Understanding of DevOps and CI/CD automation tools for data engineering workflows.
- SQL familiarity across large datasets and distributed databases.

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: you'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: we'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: we'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: you'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
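One concrete "preventive measure" for the transient failures this role troubleshoots is retrying flaky extract or load steps with exponential backoff. A generic sketch only (not EY tooling; orchestrators like Airflow and services like Glue expose their own retry settings, which would normally be used instead):

```python
# Generic exponential-backoff retry for a transient pipeline step. In a
# real deployment you would usually rely on the orchestrator's built-in
# retry policy (e.g. Airflow task retries) rather than hand-rolling this.
import time

def with_retries(step, attempts: int = 3, base_delay: float = 0.01):
    """Call `step()` up to `attempts` times, doubling the delay each try;
    re-raise the last error if every attempt fails."""
    for i in range(attempts):
        try:
            return step()
        except Exception:
            if i == attempts - 1:
                raise              # out of attempts: surface the failure
            time.sleep(base_delay * 2 ** i)
```

Backoff matters because immediate retries against an overloaded Redshift cluster or throttled API tend to prolong the incident rather than recover from it.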
Posted 4 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are looking for a seasoned and strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure.

Your Key Responsibilities
- Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
- Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
- Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
- Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence.
- Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
- Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
- Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
- Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
- Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
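In miniature, the orchestration concern above — running dependent pipeline steps in the right order while honoring SLAs — comes down to dependency resolution. A real deployment would use Apache Airflow or AWS Data Pipeline, but the core idea can be sketched with Python's stdlib `graphlib`; the step names below are illustrative only, not from any actual EY pipeline:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps mapped to their upstream dependencies.
# In practice these would correspond to Glue jobs, EMR steps, or
# Databricks notebooks rather than plain strings.
pipeline = {
    "extract_s3": set(),
    "clean_emr": {"extract_s3"},
    "transform_glue": {"clean_emr"},
    "load_redshift": {"transform_glue"},
    "validate": {"load_redshift"},
}

def run_order(dag):
    """Return steps in an order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(pipeline)
print(order)  # extract_s3 first, validate last
```

An orchestrator like Airflow adds scheduling, retries, and SLA alerting on top of exactly this dependency graph.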
Skills And Attributes For Success
- Expertise in AWS data services and the ability to lead architectural discussions.
- Analytical thinker with the ability to design and optimize end-to-end data workflows.
- Excellent debugging and incident-resolution skills in large-scale data environments.
- Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
- A growth mindset with a passion for building reliable, scalable data systems.
- Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.

To qualify for the role, you must have
- 5–8 years of experience in DataOps, Data Engineering, or related roles.
- Strong hands-on expertise in Databricks.
- Deep understanding of ETL pipelines and modern data integration patterns.
- Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
- Experience with Airflow or AWS Data Pipeline for orchestration and scheduling.
- Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
- SQL skills with an emphasis on performance tuning, complex joins, and window functions.

Technologies and Tools

Must haves
- Proficient in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda
- Expert in Databricks, with the ability to develop, optimize, and troubleshoot advanced notebooks
- Strong experience with Amazon Redshift for scalable data warehousing and analytics
- Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline
- Hands-on with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms

Good to have
- Exposure to Power BI or Tableau for data visualization
- Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms
- Understanding of DevOps and CI/CD automation tools for data engineering workflows
- SQL familiarity across large datasets and distributed databases

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
- Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
- ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
- Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
- Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
- Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
- Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
- DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
- Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
- Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
- Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
- Solid understanding of ETL/ELT design and implementation principles
- Strong SQL and PySpark skills for data transformation and validation
- Exposure to Python for automation and scripting
- Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
- Experience working with Power BI or Tableau for data visualization and reporting support
- Strong problem-solving skills, attention to detail, and commitment to data quality
- Excellent communication and documentation skills to interface with technical and business teams
- Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
- 4–6 years of experience in DataOps or Data Engineering roles
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
- Experience working with Informatica CDI or similar data integration tools
- Scripting and automation experience in Python/PySpark
- Ability to support data pipelines in a rotational on-call or production support environment
- Comfortable working in a remote/hybrid and cross-functional team setup

Technologies and Tools

Must haves
- Azure Databricks: Experience in data transformation and processing using notebooks and Spark.
- Azure Data Lake: Experience working with hierarchical data storage in Data Lake.
- Azure Synapse: Familiarity with distributed data querying and data warehousing.
- Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines.
- ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.

Good to have
- Power BI or Tableau for reporting support
- Monitoring/logging using Azure Monitor or Log Analytics
- Azure DevOps and Git for CI/CD and version control
- Python and/or PySpark for scripting and data handling
- Informatica Cloud Data Integration (CDI) or similar ETL tools
- Shell scripting or command-line data tools
- SQL (across distributed and relational databases)

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
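The "Data Quality & Validation" responsibility in the posting above usually amounts to profiling null rates and conformance before data moves downstream. A minimal stdlib sketch of such a check follows; the column names, sample records, and 10% threshold are hypothetical, and production checks would typically run in SQL or PySpark rather than plain Python:

```python
def profile_nulls(rows, columns):
    """Return the fraction of missing (None) values per column."""
    counts = {c: 0 for c in columns}
    for row in rows:
        for c in columns:
            if row.get(c) is None:
                counts[c] += 1
    n = len(rows) or 1  # avoid division by zero on empty input
    return {c: counts[c] / n for c in columns}

# Illustrative records, e.g. holdings data landed in a data lake.
rows = [
    {"security_id": "A1", "price": 101.5},
    {"security_id": "A2", "price": None},
    {"security_id": None, "price": 99.0},
    {"security_id": "A4", "price": 100.0},
]

report = profile_nulls(rows, ["security_id", "price"])
print(report)  # {'security_id': 0.25, 'price': 0.25}

# Flag columns exceeding a (hypothetical) 10% null threshold.
violations = [c for c, rate in report.items() if rate > 0.10]
```

In a pipeline, a non-empty `violations` list would fail the run before bad data reaches Synapse or a Power BI dashboard.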
Posted 4 weeks ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
- Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
- ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
- Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
- Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
- Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
- Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
- DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
- Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
- Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
- Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
- Solid understanding of ETL/ELT design and implementation principles
- Strong SQL and PySpark skills for data transformation and validation
- Exposure to Python for automation and scripting
- Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
- Experience working with Power BI or Tableau for data visualization and reporting support
- Strong problem-solving skills, attention to detail, and commitment to data quality
- Excellent communication and documentation skills to interface with technical and business teams
- Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
- 4–6 years of experience in DataOps or Data Engineering roles
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
- Experience working with Informatica CDI or similar data integration tools
- Scripting and automation experience in Python/PySpark
- Ability to support data pipelines in a rotational on-call or production support environment
- Comfortable working in a remote/hybrid and cross-functional team setup

Technologies and Tools

Must haves
- Azure Databricks: Experience in data transformation and processing using notebooks and Spark.
- Azure Data Lake: Experience working with hierarchical data storage in Data Lake.
- Azure Synapse: Familiarity with distributed data querying and data warehousing.
- Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines.
- ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.

Good to have
- Power BI or Tableau for reporting support
- Monitoring/logging using Azure Monitor or Log Analytics
- Azure DevOps and Git for CI/CD and version control
- Python and/or PySpark for scripting and data handling
- Informatica Cloud Data Integration (CDI) or similar ETL tools
- Shell scripting or command-line data tools
- SQL (across distributed and relational databases)

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are looking for a seasoned and strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure.

Your Key Responsibilities
- Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
- Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
- Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
- Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence.
- Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
- Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
- Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
- Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
- Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
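The window-function and query-optimization skills this role calls for can be illustrated with a small self-contained query. The table and data below are hypothetical stand-ins, run against SQLite for portability; the `ROW_NUMBER() OVER (PARTITION BY ...)` pattern carries over directly to Redshift, where it replaces an expensive self-join for "latest row per key" lookups:

```python
import sqlite3

# In-memory table of illustrative daily prices per security.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE prices (sec TEXT, day INTEGER, px REAL)")
con.executemany(
    "INSERT INTO prices VALUES (?, ?, ?)",
    [("A", 1, 10.0), ("A", 2, 12.0), ("B", 1, 20.0), ("B", 2, 18.0)],
)

# Window function: latest price per security without a self-join.
rows = con.execute("""
    SELECT sec, px FROM (
        SELECT sec, px,
               ROW_NUMBER() OVER (PARTITION BY sec ORDER BY day DESC) AS rn
        FROM prices
    ) WHERE rn = 1
    ORDER BY sec
""").fetchall()
print(rows)  # [('A', 12.0), ('B', 18.0)]
```

(Window functions require SQLite 3.25 or newer, which ships with current Python builds.)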
Skills And Attributes For Success
- Expertise in AWS data services and the ability to lead architectural discussions.
- Analytical thinker with the ability to design and optimize end-to-end data workflows.
- Excellent debugging and incident-resolution skills in large-scale data environments.
- Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
- A growth mindset with a passion for building reliable, scalable data systems.
- Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.

To qualify for the role, you must have
- 5–8 years of experience in DataOps, Data Engineering, or related roles.
- Strong hands-on expertise in Databricks.
- Deep understanding of ETL pipelines and modern data integration patterns.
- Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
- Experience with Airflow or AWS Data Pipeline for orchestration and scheduling.
- Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
- SQL skills with an emphasis on performance tuning, complex joins, and window functions.

Technologies and Tools

Must haves
- Proficient in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda
- Expert in Databricks, with the ability to develop, optimize, and troubleshoot advanced notebooks
- Strong experience with Amazon Redshift for scalable data warehousing and analytics
- Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline
- Hands-on with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms

Good to have
- Exposure to Power BI or Tableau for data visualization
- Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms
- Understanding of DevOps and CI/CD automation tools for data engineering workflows
- SQL familiarity across large datasets and distributed databases

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
- Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
- ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
- Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
- Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
- Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
- Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
- DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
- Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
- Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
- Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
- Solid understanding of ETL/ELT design and implementation principles
- Strong SQL and PySpark skills for data transformation and validation
- Exposure to Python for automation and scripting
- Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
- Experience working with Power BI or Tableau for data visualization and reporting support
- Strong problem-solving skills, attention to detail, and commitment to data quality
- Excellent communication and documentation skills to interface with technical and business teams
- Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
- 4–6 years of experience in DataOps or Data Engineering roles
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
- Experience working with Informatica CDI or similar data integration tools
- Scripting and automation experience in Python/PySpark
- Ability to support data pipelines in a rotational on-call or production support environment
- Comfortable working in a remote/hybrid and cross-functional team setup

Technologies and Tools

Must haves
- Azure Databricks: Experience in data transformation and processing using notebooks and Spark.
- Azure Data Lake: Experience working with hierarchical data storage in Data Lake.
- Azure Synapse: Familiarity with distributed data querying and data warehousing.
- Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines.
- ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.

Good to have
- Power BI or Tableau for reporting support
- Monitoring/logging using Azure Monitor or Log Analytics
- Azure DevOps and Git for CI/CD and version control
- Python and/or PySpark for scripting and data handling
- Informatica Cloud Data Integration (CDI) or similar ETL tools
- Shell scripting or command-line data tools
- SQL (across distributed and relational databases)

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
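The "ETL Process Understanding" bullet in the posting above — extraction, cleansing, mapping, loading — reduces to a small sketch. The field names and the source-to-target mapping below are invented for illustration; a real pipeline would express the same steps in ADF data flows or a Databricks notebook:

```python
# Extract: raw records as they might land from a source system.
raw = [
    {"SEC_ID": " a1 ", "PX": "101.50"},
    {"SEC_ID": "A2",   "PX": "bad"},   # unparseable price: skipped
    {"SEC_ID": "a3",   "PX": "99.00"},
]

# Mapping from source fields to the target schema (hypothetical).
FIELD_MAP = {"SEC_ID": "security_id", "PX": "price"}

def transform(records):
    """Cleanse (trim/uppercase, parse floats) and remap field names."""
    out = []
    for rec in records:
        try:
            out.append({
                FIELD_MAP["SEC_ID"]: rec["SEC_ID"].strip().upper(),
                FIELD_MAP["PX"]: float(rec["PX"]),
            })
        except ValueError:
            continue  # a real pipeline would quarantine bad rows
    return out

clean = transform(raw)
print(clean)
```

The "load" step would then write `clean` to the warehouse; the cleansing and mapping logic is what the posting means by data integration techniques.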
Posted 4 weeks ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are looking for a seasoned and strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure.

Your Key Responsibilities
- Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
- Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
- Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
- Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence.
- Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
- Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
- Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
- Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
- Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
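Preventive measures after root-cause analysis often mean retrying transient failures with backoff instead of failing a run on the first error. A stdlib sketch of the idea follows; the delays and the contrived failing task are purely illustrative, and a real pipeline would get this behavior from its orchestrator's retry policy:

```python
import time

def with_retries(task, attempts=3, base_delay=0.01):
    """Run task(); on failure, retry with exponential backoff."""
    for i in range(attempts):
        try:
            return task()
        except Exception:
            if i == attempts - 1:
                raise  # retries exhausted: escalate to on-call
            time.sleep(base_delay * (2 ** i))  # 1x, 2x, 4x, ...

# Contrived task that fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded"

result = with_retries(flaky_load)
print(result, calls["n"])  # loaded 3
```

The backoff keeps transient errors (throttling, brief network blips) from surfacing as incidents, while a persistent failure still raises after the final attempt.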
Skills And Attributes For Success Expertise in AWS data services and ability to lead architectural discussions. Analytical thinker with the ability to design and optimize end-to-end data workflows. Excellent debugging and incident resolution skills in large-scale data environments. Strong leadership and mentoring capabilities, with clear communication across business and technical teams. A growth mindset with a passion for building reliable, scalable data systems. Proven ability to manage priorities and navigate ambiguity in a fast-paced environment. To qualify for the role, you must have 5–8 years of experience in DataOps, Data Engineering, or related roles. Strong hands-on expertise in Databricks. Deep understanding of ETL pipelines and modern data integration patterns. Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments. Experience with Airflow or AWS Data Pipeline for orchestration and scheduling. Advanced knowledge of IICS or similar ETL tools for data transformation and automation. SQL skills with emphasis on performance tuning, complex joins, and window functions. Technologies and Tools Must haves Proficient in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda Expert in Databricks – ability to develop, optimize, and troubleshoot advanced notebooks Strong experience with Amazon Redshift for scalable data warehousing and analytics Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline Hands-on with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms Good to have Exposure to Power BI or Tableau for data visualization Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms Understanding of DevOps and CI/CD automation tools for data engineering workflows SQL familiarity across large datasets and distributed databases What We Look For Enthusiastic learners with a passion for data ops and best practices. 
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
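The SQL emphasis in this role (performance tuning, complex joins, window functions) can be illustrated with a small, self-contained sketch using Python's bundled sqlite3; the table and values are hypothetical, and Redshift's window-function syntax is broadly similar:

```python
import sqlite3

# Hypothetical orders table; in this role the data would live in Redshift,
# but SQLite (bundled with Python; window functions need SQLite 3.25+)
# is enough to sketch the pattern.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES
        ('acme', 1, 120.0), ('acme', 2, 340.0),
        ('globex', 3, 80.0), ('globex', 4, 55.0);
""")

# Window function: rank each customer's orders by amount, keep the largest.
# Filtering a ROW_NUMBER() in an outer query (instead of self-joining on MAX)
# is a common performance-friendly pattern on Redshift as well.
rows = conn.execute("""
    SELECT customer, order_id, amount FROM (
        SELECT customer, order_id, amount,
               ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY customer;
""").fetchall()

print(rows)  # one row per customer: that customer's largest order
```

The same shape (partition, order, filter on rank) covers deduplication and latest-record queries, both routine in DataOps pipelines.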
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Company Everest DX is a digital platform services company headquartered in Stamford. Our platform and solutions include orchestration, intelligent operations with bots, and AI-powered analytics for enterprise IT. Our vision is to enable digital transformation for enterprises to deliver seamless customer experience, business efficiency and actionable insights through an integrated set of futuristic digital technologies. Digital Transformation Services: We specialize in designing, building, developing, integrating, and managing cloud solutions; modernizing data centers; building cloud-native applications; and migrating existing applications into secure, multi-cloud environments to support digital transformation. Our digital platform services enable organizations to reduce IT resource requirements and improve productivity, in addition to lowering costs and speeding digital transformation. Digital Platform: Cloud Intelligent Management (CiM) is an autonomous hybrid cloud management platform that works across multi-cloud environments and helps enterprises get the most out of their cloud strategy while reducing cost and risk and increasing speed. Responsibilities The candidate should have hands-on experience with ETL and SQL. Design, develop, and optimize ETL workflows using Informatica PowerCenter. Implement cloud-based ETL solutions using Informatica IDMC and IICS. Should have expertise in all transformations in PowerCenter and IDMC/IICS. Should have experience with or knowledge of PowerCenter-to-IICS migration using the CDI PC tool or a comparable tool. Lead data migration projects, transitioning data from on-premises to cloud environments. Write complex SQL queries and perform data validation and transformation. Conduct detailed data analysis to ensure accuracy and integrity of migrated data. Troubleshoot and optimize ETL processes for performance and error handling. Collaborate with cross-functional teams to gather requirements and design solutions. 
Create and maintain documentation for ETL processes and system configurations. Implement industry best practices for data integration and performance tuning. Required Skills Hands-on experience with Informatica PowerCenter, IDMC and IICS. Strong expertise in writing complex SQL queries and in database management. Experience in data migration projects (on-premises to cloud). Strong data analysis skills for large datasets and ensuring accuracy. Solid understanding of ETL design and development concepts. Familiarity with cloud platforms (AWS, Azure). Experience with version control tools (e.g., Git) and deployment processes. Preferred Skills Experience with data lakes, data warehousing, or big data platforms. Familiarity with Agile methodologies. Knowledge of other ETL tools.
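The data-validation step named in the responsibilities above can be sketched in plain Python; the fingerprinting approach and the sample rows are illustrative, not a prescribed Informatica workflow:

```python
import hashlib

# Illustrative post-migration check: compare row counts and an
# order-insensitive content fingerprint between source and target extracts.
def fingerprint(rows):
    """XOR of per-row SHA-256 digests: order-insensitive, cheap to compare."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

# Sample extracts (same rows, different physical order after migration).
source = [(1, "alice", 10.5), (2, "bob", 20.0)]
target = [(2, "bob", 20.0), (1, "alice", 10.5)]

assert len(source) == len(target)
assert fingerprint(source) == fingerprint(target)
print("validation passed")
```

In practice the two row sets would come from the on-premises and cloud databases via queries, with the count and checksum comparison scheduled after each migration batch.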
Posted 1 month ago
2.0 - 3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary: The Account Executive Sales position focuses on growing the business and expanding relationships with SBM customers in a predefined territory. Our Account Executive is responsible for the acquisition of new and developing Small and Medium customers, with the main objective of revenue, volume and gross profit growth, delivered through sustainable supply chain solutions. Designation Internal: Account Executive - Business Development External: Account Manager - Business Development Essential Functions Of The Role Business development and acquisition of new large national and global customers in line with organization growth and go-to-market strategies. Focus on long-term sustainable business strategies – RFQs and long-term contracts. Developing and implementing sound retention strategies, utilizing strong negotiation efforts to preserve business and securing contract agreements from previously non-contracted customers. Control revenue by maximizing profits through price strategies, margin control and mitigating customer loss. Presenting the complete UPS portfolio to customers to penetrate revenue and volume streams. 
Achieve assigned monthly, quarterly and yearly sales goals – customer acquisition, volume growth, revenue growth and gross profit. Update sales activities in the UPS Drive sales system in a timely manner and follow sales and organizational policies. Report on all aspects of sales activities and customers to the manager on a daily basis. Collaborate with Pricing, Procurement, Operations and Network to design solutions for customers. Key Skill Sets/Competencies Professional selling skills/consultative selling techniques and impactful presentation and communication skills. Negotiation and objection-handling skills for maintenance and sustainability. Strategic thinking and relationship building. Self-motivated and result-oriented. Demonstrates adaptability and accountability. Uses ethical practices. Collaborative and interpersonal skills; networks and uses experts and tools – Drive, Sales Navigator and other systems – to help leverage customer relationships. Hands-on with the sales and solution tools and technologies. Ownership and constant initiative in adversity; uses planning activities to achieve assigned goals. Job Duties Pre-Selling: Travels to customer sites for face-to-face meetings to gather information about their businesses and identify opportunities for solutions Customizes standardized presentation templates with customers’ information to illustrate the benefits of solutions to customers Analyzes customers’ billing technology to understand their needs and recommend UPS products and services Performs pre-call analysis (e.g., researches account history, shipping details, complaints, etc.) to prepare for sales calls Researches resources (e.g., current customers, periodicals, competitors, etc.) to identify sales opportunities and obtain contact information Selling: Responds to internal sales leads from various sources (e.g., Sales Lead Incentive Management system, operations, Package Operations staff, etc.) 
to identify sales opportunities and create a sales strategy Sells UPS suite of technology solutions (e.g., Trade Ability, Quantum View, and Campus Ship, etc.) to customers to secure their business Assesses previous sales calls to determine action plans for subsequent visits Research existing UPS account history to obtain background information (e.g., pay history, shipping routes, etc.) and identify and prioritize large sales opportunities Utilizes DRIVE to document customer information and provide account status to the sales team Maintains and monitors records of customer information and account performance to track sales performance to objectives Reviews various Business Information and Analysis reporting tools to assess account performance and generate reports for management Presents solutions to customers to gain approval of proposals and move forward with the sales cycle Executes on previously signed contracts (e.g., UPS Freight/ UPS Express) to introduce new products and services to customers and expand business within customer accounts Submits customer pricing requests to Pricing Analysts to generate new or revised price quotes Negotiates with internal groups (e.g., Pricing, Revenue Management, etc.) to create proposals and move forward with the sales process Analyzes price quotes to verify accuracy and determine how to propose the solution to the customer Setting up SOP – Sales Operating procedure or MOP – Master Operating Plans (e.g., later pick up times, etc.) to satisfy customer complaints. - Collaborates with operations to implement accounts with special needs (e.g., unique delivery schedules, extra conveyors, additional drivers, etc.) 
to adequately handle customers’ shipping needs Generates customer-facing reports to outline shipping history, billing history, and account incentives (i.e., contracted discounts) and renegotiates contracts Follows up with customers to ensure they trade to potential/commitment Trains customers on billing analysis tools and electronic billing files to facilitate report generation Trains customers on proper packaging techniques to avoid damages Participates in UPS online training classes to prepare for products and services assessments and quizzes and to stay current on industry knowledge Post-Selling: Facilitates research of customer complaints (e.g., late deliveries, damages, billing questions, etc.) to determine appropriate resolution personnel; discusses complaints with UPS personnel (e.g., Business Development Manager, business center managers, operations, billing, drivers, etc.) to determine corrective actions and resolutions Facilitates debt recovery from customers in conjunction with F&A Facilitates proper onboarding of new customers based on BD guidelines Qualifications and Job Specification Minimum: Bachelor’s degree; a Master’s in Business Management or an additional diploma in Sales and Marketing is preferable. Minimum 2-3 years of relevant international corporate sales experience in a similar industry. Expert in professional selling skills and consultative selling techniques. 
Customer satisfaction and objection-handling skills for maintenance and sustainability. Excellent communication and presentation skills. Focus on sales: networks and uses experts, tools and systems to help leverage customer relationships and build a varied pipeline of new accounts. High energy levels directed at productive results across the sales funnel and calls, with the ability to stretch work hours for opportunities while remaining enthusiastic. Compensation & Benefits: The position will be offered to candidates under local terms of employment. Contract type: permanent. At UPS, equal opportunities, fair treatment and an inclusive work environment are key values to which we are committed.
Posted 1 month ago
4.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
EY-Consulting - Data and Analytics – Senior – MDM EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making. Your Key Responsibilities The candidate should possess a minimum of 4-8 years of direct involvement with Informatica IDMC SaaS development or an equivalent MDM tool, including a deep understanding of both Informatica Intelligent Cloud Services (IICS) and Intelligent Data Management Cloud (IDMC). The individual must exhibit a high level of expertise in CDI (Cloud Data Integration) and CAI (Cloud Application Integration), coupled with a robust foundation in API development, preferably within the Informatica framework. 
Should have experience in CDQ (Cloud Data Quality), along with profiling experience in CDP (Cloud Data Profiling). Skills And Attributes For Success Cloud Data Stewardship: Employ advanced CDI tools to facilitate efficient data amalgamation within the cloud. Application Convergence: Execute CAI strategies to ensure uninterrupted interconnectivity amongst diverse applications. Data Integrity Assurance: Implement cutting-edge DQ instruments and protocols to maintain impeccable data standards throughout the infrastructure. API Architecting: Craft and execute sophisticated APIs tailored to meet a variety of integration needs. Analytical Collaboration: Engage in meticulous requirement gathering, interpret Business Requirement Documents (BRDs), and collaborate with system analysts to forge comprehensive source-to-target mappings. Be deeply involved in the strategic design, implementation, and optimization of data management solutions using Informatica, ensuring alignment with business objectives and driving data governance and integration across the enterprise. Data Scrutiny: Conduct thorough examinations of data within source databases prior to migration into data warehouses and generate detailed technical specifications in alignment with BRDs. What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. 
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you
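The profiling work that CDP automates can be illustrated with a minimal, hand-rolled column profile in Python; the records and report fields here are assumptions for the sketch, not IDMC output:

```python
from collections import Counter

# Toy dataset: a few customer records with a deliberate null.
records = [
    {"id": 1, "country": "IN", "email": "a@x.com"},
    {"id": 2, "country": "IN", "email": None},
    {"id": 3, "country": "US", "email": "c@x.com"},
]

def profile(rows):
    """Per-column null count, distinct count, and most frequent value —
    the basic statistics a data-profiling pass reports."""
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        top = Counter(non_null).most_common(1)[0][0] if non_null else None
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "top": top,
        }
    return report

print(profile(records))
```

A real CDQ/CDP setup would compute these statistics (and rule-based scores) at scale inside the platform; the value of the sketch is only to show what the profile of a column contains.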
Posted 1 month ago
4.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
EY-Consulting - Data and Analytics – Senior – MDM EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making. Your Key Responsibilities The candidate should possess a minimum of 4-8 years of direct involvement with Informatica IDMC SaaS development or an equivalent MDM tool, including a deep understanding of both Informatica Intelligent Cloud Services (IICS) and Intelligent Data Management Cloud (IDMC). The individual must exhibit a high level of expertise in CDI (Cloud Data Integration) and CAI (Cloud Application Integration), coupled with a robust foundation in API development, preferably within the Informatica framework. 
Should have experience in CDQ (Cloud Data Quality), along with profiling experience in CDP (Cloud Data Profiling). Skills And Attributes For Success Cloud Data Stewardship: Employ advanced CDI tools to facilitate efficient data amalgamation within the cloud. Application Convergence: Execute CAI strategies to ensure uninterrupted interconnectivity amongst diverse applications. Data Integrity Assurance: Implement cutting-edge DQ instruments and protocols to maintain impeccable data standards throughout the infrastructure. API Architecting: Craft and execute sophisticated APIs tailored to meet a variety of integration needs. Analytical Collaboration: Engage in meticulous requirement gathering, interpret Business Requirement Documents (BRDs), and collaborate with system analysts to forge comprehensive source-to-target mappings. Be deeply involved in the strategic design, implementation, and optimization of data management solutions using Informatica, ensuring alignment with business objectives and driving data governance and integration across the enterprise. Data Scrutiny: Conduct thorough examinations of data within source databases prior to migration into data warehouses and generate detailed technical specifications in alignment with BRDs. What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. 
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you
Posted 1 month ago
4.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
EY-Consulting - Data and Analytics – Senior – MDM EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making. Your Key Responsibilities The candidate should possess a minimum of 4-8 years of direct involvement with Informatica IDMC SaaS development or an equivalent MDM tool, including a deep understanding of both Informatica Intelligent Cloud Services (IICS) and Intelligent Data Management Cloud (IDMC). The individual must exhibit a high level of expertise in CDI (Cloud Data Integration) and CAI (Cloud Application Integration), coupled with a robust foundation in API development, preferably within the Informatica framework. 
Should have experience in CDQ (Cloud Data Quality), along with profiling experience in CDP (Cloud Data Profiling). Skills And Attributes For Success Cloud Data Stewardship: Employ advanced CDI tools to facilitate efficient data amalgamation within the cloud. Application Convergence: Execute CAI strategies to ensure uninterrupted interconnectivity amongst diverse applications. Data Integrity Assurance: Implement cutting-edge DQ instruments and protocols to maintain impeccable data standards throughout the infrastructure. API Architecting: Craft and execute sophisticated APIs tailored to meet a variety of integration needs. Analytical Collaboration: Engage in meticulous requirement gathering, interpret Business Requirement Documents (BRDs), and collaborate with system analysts to forge comprehensive source-to-target mappings. Be deeply involved in the strategic design, implementation, and optimization of data management solutions using Informatica, ensuring alignment with business objectives and driving data governance and integration across the enterprise. Data Scrutiny: Conduct thorough examinations of data within source databases prior to migration into data warehouses and generate detailed technical specifications in alignment with BRDs. What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. 
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary: Sound understanding of and experience with various testing methodologies (manual, automation, regression, performance), test planning and design, SQL and NoSQL, and CI/CD as it relates to test automation; experience with testing tools (JMeter, SoapUI, Cucumber); strong analytical skills; scripting languages (Python, JavaScript) preferred. Experience with Agile development. Excellent written and verbal communication skills. Must be a team player who shows initiative and is detail-oriented. Coaching of junior QA staff. This position provides mentorship and expertise in technologies and processes for Information Services Management (ISM) and Quality Assurance (QA). He/She maintains an awareness of emerging technologies to ensure a competitive advantage. This position automates test scenarios and expected outcomes. He/She provides expertise for UPS key business functions and supporting technologies. This position applies a comprehensive knowledge of technical skills, principles, practices, and procedures of testing methodologies and working knowledge in planning, designing, and conducting QA reviews and inspections. 
This position conducts comprehensive testing and risk-based assessments of the test objects. He/She uses source documentation as input and contributes to the planning and implementation of testing activities. This position leads testing components of large and complex projects, assigns tasks, provides direction to resources, and reports progress to project stakeholders. He/She creates and selects tools and methodologies for review and approval by management. Responsibilities Conducts quality assessment (QA) of development processes. Develops test solutions. Provides expertise in testing across the QA organization. Develops and implements new practices and testing standards. Contributes to project design. Qualifications Bachelor's Degree or international equivalent Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics or a related field - Preferred Experience with both web and client/server-based testing Contract type: permanent. At UPS, equal opportunities, fair treatment and an inclusive work environment are key values to which we are committed.
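The test-automation focus of this role can be illustrated with a minimal example using Python's standard unittest module; shipping_cost and its rates are hypothetical stand-ins for a system under test, not a UPS service:

```python
import unittest

def shipping_cost(weight_kg, express=False):
    """Stand-in for the system under test (rates are illustrative)."""
    base = 5.0 + 2.0 * weight_kg
    return round(base * (1.5 if express else 1.0), 2)

class ShippingCostTests(unittest.TestCase):
    # Automated scenarios with expected outcomes, as the role describes.
    def test_standard_rate(self):
        self.assertEqual(shipping_cost(2), 9.0)

    def test_express_surcharge(self):
        self.assertEqual(shipping_cost(2, express=True), 13.5)

    def test_zero_weight_charges_base(self):
        self.assertEqual(shipping_cost(0), 5.0)

# Run the suite programmatically, as a CI/CD stage would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ShippingCostTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("passed" if result.wasSuccessful() else "failed")
```

The same structure scales to regression suites: each scenario is a small, named method, and the runner's pass/fail result gates the pipeline.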
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About UPS

UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking a Data Developer to join our data engineering team responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB.
The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure.

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks. Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB). Data Modeling: NoSQL data modeling, data warehousing concepts. Performance Optimization: data pipeline performance tuning and cost optimization. Programming Languages: Python, SQL, PySpark.

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation. Security and Compliance: implementing data security and governance standards. Agile Methodologies: experience in Agile/Scrum environments.

Soft Skills
Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications in Azure and Data Engineering, such as: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

Contract type: permanent (CDI). At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
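As context for the pipeline work this role describes, here is a minimal sketch of a common Databricks/PySpark-style transformation (deduplicate by business key, keep the latest record), written in plain Python so the logic is easy to follow. The record shape (`id`, `updated_at` fields) and the dedup rule are hypothetical illustrations, not from the posting.

```python
# Illustrative sketch: "keep the most recent record per key", a staple transform
# in Azure Databricks pipelines. Field names here are hypothetical.
from itertools import groupby

def dedup_keep_latest(rows):
    """Keep the most recent record per business key.

    Equivalent in spirit to a PySpark window transform:
    row_number() over (partition by id order by updated_at desc) == 1.
    """
    # Sort by key, then timestamp ascending, so the last row in each group is newest.
    rows = sorted(rows, key=lambda r: (r["id"], r["updated_at"]))
    return [list(group)[-1] for _, group in groupby(rows, key=lambda r: r["id"])]
```

In an actual Databricks notebook the same idea would typically use `pyspark.sql.Window.partitionBy("id").orderBy(...)` with `row_number()` so the work distributes across the cluster.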
Posted 1 month ago
3.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a detail-oriented and proactive AWS DataOps Engineer to join our growing data team. In this role, you will be responsible for supporting and optimizing cloud-based data pipelines and ETL workflows across AWS services. You will collaborate with analytics, engineering, and operations teams to ensure the secure, reliable, and scalable movement and transformation of data.

Your Key Responsibilities
Monitor and maintain data pipelines using AWS Glue, EMR, Lambda, and Amazon S3. Support and enhance ETL workflows leveraging IICS (Informatica Intelligent Cloud Services), Databricks, and other AWS-native tools. Collaborate with engineering teams to manage ingestion pipelines into Amazon Redshift and perform data quality validations. Assist in job scheduling and orchestration via Apache Airflow, AWS Data Pipeline, or similar tools. Write and debug SQL queries across Redshift and other AWS databases for data analysis and transformation. Troubleshoot and perform root cause analysis of pipeline failures and performance issues in distributed systems. Participate in deployment activities using version control and CI/CD pipelines. Create and maintain SOPs, runbooks, and documentation for operational workflows. Work closely with vendors and internal teams to maintain high system availability and ensure compliance.

Skills And Attributes For Success
Strong knowledge of AWS data services and architecture. Ability to analyze complex workflows and proactively resolve issues related to performance or data quality. Solid troubleshooting and problem-solving skills with strong attention to detail. Effective communication skills for collaborating across teams and documenting findings or standard practices. A self-motivated learner with a passion for process improvement and cloud technologies. Comfortable handling multiple tasks and shifting priorities in a dynamic environment.

To qualify for the role, you must have
2–3 years of experience in DataOps or Data Engineering roles. Hands-on experience with Databricks for data engineering and transformation. Understanding of ETL processes and best practices in data movement. Working knowledge of Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda. Experience with Amazon Redshift, including querying and managing large analytical datasets. Familiarity with job orchestration tools like Apache Airflow or AWS Data Pipeline. Experience with IICS (Informatica Intelligent Cloud Services) or equivalent ETL tools. SQL skills for data transformation, validation, and performance tuning.

Technologies and Tools
Must haves: S3, EMR (Elastic MapReduce), and Glue for data processing and orchestration; Databricks (ability to understand and run existing notebooks for data transformation); Amazon Redshift for data warehousing and SQL-based analysis; Apache Airflow or AWS Data Pipeline; AWS Lambda; basic operational experience with IICS (Informatica Intelligent Cloud Services) or similar ETL platforms.
Good to have: exposure to Power BI or Tableau for data visualization and dashboard creation; knowledge of CDI, Informatica, or other enterprise data integration platforms; understanding of DevOps tools and practices, especially in data pipeline CI/CD contexts.

What We Look For
Enthusiastic learners with a passion for DataOps practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills.
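To illustrate the SQL-based data-quality validations this role mentions, here is a small Python sketch that generates a Redshift-compatible null-rate check for one column. The table and column names used in the example are hypothetical, chosen only for illustration.

```python
# Illustrative sketch: generating a simple data-quality check of the kind a
# DataOps engineer runs against Redshift. Table/column names are hypothetical.
def null_check_sql(table: str, column: str) -> str:
    """Build a query reporting total rows and how many have a NULL in `column`."""
    return (
        f"SELECT COUNT(*) AS total, "
        f"SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) AS nulls "
        f"FROM {table};"
    )

# Example: a check that could run after each ingestion into a staging table.
query = null_check_sql("orders", "customer_id")
```

In a pipeline, checks like this are typically executed as a post-load task (for example, an Airflow task after the Glue or Databricks step) and the run fails if `nulls` exceeds an agreed threshold.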
What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago