
4 ETL Methodologies Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We're looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities
- Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance, and data management.
- Interact with senior client technology leaders to understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
- Define and develop client-specific best practices around data management within a cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (a minimal ingestion sketch follows this posting).
- Manage teams and own end-to-end delivery, with experience building the technical capability and teams needed to deliver.

Skills and attributes for success
- Strong understanding of and familiarity with all Cloud Ecosystem components.
- Strong understanding of underlying cloud architectural concepts and distributed computing paradigms.
- Experience in the development of large-scale data processing.
- Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, and SQL.
- Hands-on expertise in cloud services such as AWS and/or the Microsoft Azure ecosystem.
- Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance.
- Experience with BI and data analytics databases.
- Experience converting business problems and challenges into technical solutions, considering security, performance, scalability, etc.
- Experience with enterprise-grade solution implementations and performance benchmarking of enterprise applications.
- Strong stakeholder, client, team, process, and delivery management skills.

To qualify for the role, you must have
- A flexible, proactive, and self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal), and the ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards, participating in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- A minimum of 7 years of hands-on experience in one or more of the above areas, and a minimum of 10 years of industry experience.

Ideally, you'll also have
- Project management, client management, and solutioning skills.

What we look for
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
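
For orientation, here is a minimal, hypothetical sketch of the two ingestion patterns this role describes: a PySpark batch load and a Spark Structured Streaming read from Kafka. All paths, column names, the topic, and the broker address are illustrative assumptions, not details from the posting.

```python
# Hypothetical sketch only: paths, columns, topic, and broker are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion_sketch").getOrCreate()

# --- Batch ingestion: read a raw CSV extract, cleanse, write a curated layer ---
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("/mnt/landing/transactions/"))

clean = (raw
         .dropDuplicates(["transaction_id"])        # assumed business key
         .filter(F.col("transaction_id").isNotNull())
         .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
         .withColumn("ingest_date", F.current_date()))

(clean.write
 .mode("append")
 .partitionBy("ingest_date")
 .parquet("/mnt/curated/transactions/"))

# --- Real-time ingestion: Spark Structured Streaming from Kafka ---
# Requires the spark-sql-kafka connector package on the cluster.
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
          .option("subscribe", "live_events")                 # assumed topic
          .load())

events = stream.selectExpr("CAST(value AS STRING) AS payload")

query = (events.writeStream
         .format("parquet")
         .option("path", "/mnt/curated/live_events/")
         .option("checkpointLocation", "/mnt/checkpoints/live_events/")
         .start())
```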

Posted 4 days ago


3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer, you will be responsible for designing, building, and measuring complex ELT jobs to process disparate data sources and create a high-integrity, high-quality, clean data asset. You will work on various projects, including batch pipelines, data modeling, and data mart solutions, as part of collaborative teams implementing robust data collection and processing pipelines to meet specific business needs. Your main goals will include executing and providing feedback on data modeling policies, procedures, processes, and standards, as well as capturing and documenting system flow and technical information related to data, database design, and systems. You will be expected to develop data quality standards and tools to ensure accuracy, understand new data patterns across departments, and translate high-level business requirements into technical specifications.

To qualify for this role, you should have a Bachelor's degree in computer science or engineering, along with at least 3 years of experience in data analytics, data modeling, and database design. You should also have 3+ years of coding and scripting experience (Python, Java, PySpark) and familiarity with ETL methodologies and tools, particularly Vertica. Expertise in tuning and troubleshooting SQL; strong data integrity, analytical, and multitasking skills; and excellent communication, problem-solving, and organizational abilities are essential. The ability to work independently is also required.

Preferred additional skills include familiarity with agile project delivery processes, experience with Airflow (see the DAG sketch after this posting), and the ability to manage diverse projects impacting multiple roles and processes. You should be adept at troubleshooting problem areas, identifying data gaps and issues, and adapting to a fast-changing environment. Experience with Python, basic knowledge of database technologies (Vertica, Redshift, etc.), and experience in designing and implementing automated ETL processes would also be advantageous.
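
Since the preferred skills mention Airflow, here is a minimal, hypothetical DAG sketch of the kind of daily batch ELT orchestration this posting describes. The DAG id, schedule, and stubbed callables are assumptions for illustration, not the employer's actual pipeline.

```python
# Hypothetical Airflow 2.x DAG sketch; ids, schedule, and callables are assumed.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_source_data(**context):
    """Pull the day's extract from an upstream source (stubbed for illustration)."""
    ...


def load_to_warehouse(**context):
    """Load the cleaned extract into the target table (stubbed for illustration)."""
    ...


with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    # Run the load only after the extract succeeds.
    extract >> load
```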

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Teradata ETL Developer, you will be responsible for designing, developing, and implementing ETL processes using Teradata tools such as BTEQ and the TPT Utility. Your role will involve optimizing and enhancing existing ETL workflows to improve performance and reliability. Collaboration with cross-functional teams to gather data requirements and translate them into technical specifications will be a key aspect of your responsibilities. Data profiling, cleansing, and validation will also be part of your duties to ensure data quality and integrity. Monitoring ETL processes, troubleshooting any issues in the data pipeline, and participating in the technical design and architecture of data integration solutions are critical tasks you will perform. Additionally, documenting ETL processes, data mappings, and operational procedures for future reference and compliance will be essential.

To excel in this role, you should possess proven experience as a Teradata ETL Developer with a strong understanding of BTEQ and the TPT Utility. A solid grasp of data warehousing concepts, ETL methodologies, and data modeling is required. Proficiency in SQL, including the ability to write complex queries for data extraction and manipulation, is essential. Familiarity with data integration tools and techniques, especially in a Teradata environment, will be beneficial. Strong analytical and problem-solving skills are necessary to diagnose and resolve ETL issues efficiently. You should be able to work collaboratively in a team environment while also demonstrating self-motivation and attention to detail. Excellent communication skills are a must to effectively engage with both technical and non-technical stakeholders.
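
As one illustration of the post-load validation work this role involves, here is a hedged Python sketch of a row-count reconciliation against Teradata. It assumes the open-source teradatasql driver; the host, credentials, and table names are placeholders, and in practice the same check is often written as a BTEQ script instead.

```python
# Hypothetical reconciliation check; driver usage, credentials, and table
# names are assumptions for illustration, not details from the posting.
import teradatasql

RECONCILE_SQL = """
SELECT s.cnt AS staged_rows, t.cnt AS loaded_rows
FROM (SELECT COUNT(*) AS cnt FROM staging.orders_stg) s
CROSS JOIN (SELECT COUNT(*) AS cnt FROM target.orders) t
"""

with teradatasql.connect(host="tdhost", user="etl_user", password="...") as con:
    with con.cursor() as cur:
        cur.execute(RECONCILE_SQL)
        staged, loaded = cur.fetchone()
        if staged != loaded:
            raise RuntimeError(f"Load mismatch: staged={staged}, loaded={loaded}")
        print(f"Reconciled: {staged} rows in both staging and target")
```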

Posted 1 month ago


2.0 - 5.0 years

2 - 5 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for a meticulous and analytical ETL (Extract, Transform, Load) Tester to ensure the quality and integrity of our data warehousing and business intelligence solutions. The ideal candidate will be responsible for designing, developing, and executing comprehensive test plans and cases for ETL processes, data transformations, and data loads. This role requires a strong understanding of data warehousing concepts, SQL proficiency, and a keen eye for detail to validate data accuracy and consistency.

Roles and Responsibilities:
- Analyze ETL mapping documents, source-to-target mappings, and business requirements to understand data flow and transformation rules.
- Design, develop, and execute comprehensive test plans, test cases, and test scripts for ETL processes and data warehouse components.
- Perform extensive data validation and reconciliation between source and target systems to ensure data accuracy, completeness, and integrity (see the sketch after this posting).
- Write complex SQL queries to validate data transformations, data loads, and referential integrity constraints.
- Identify, document, and track defects using defect management tools, collaborating with developers to ensure timely resolution.
- Develop and maintain reusable SQL scripts and testing artifacts for future testing cycles.
- Validate data aggregation, calculations, and reporting logic within the data warehouse environment.
- Participate in all phases of the SDLC (Software Development Life Cycle) and STLC (Software Testing Life Cycle) with a focus on data quality.
- Collaborate closely with ETL developers, data architects, business analysts, and report developers to ensure alignment on data requirements and quality standards.
- Perform performance testing on ETL processes to ensure optimal loading times and system scalability.
- Contribute to the continuous improvement of ETL testing processes and methodologies.
- Generate test reports and metrics to communicate testing progress and data quality status to stakeholders.

Required Skills and Qualifications:
- Proven experience as an ETL Tester or Data Warehouse Tester.
- Strong proficiency in SQL for data querying, analysis, and validation (including complex joins, subqueries, and analytical functions).
- Solid understanding of data warehousing concepts, ETL methodologies, and dimensional modeling (Star Schema, Snowflake Schema).
- Experience with various ETL tools (e.g., Informatica PowerCenter, DataStage, SSIS, Talend) for understanding data flows.
- Ability to read and interpret ETL mapping documents and data models.
- Experience with defect tracking tools (e.g., Jira, Azure DevOps) and test management tools.
- Strong analytical and problem-solving skills with meticulous attention to detail.
- Excellent communication skills (written and verbal) and ability to articulate complex data issues clearly.
- Knowledge of scripting languages (e.g., Python) for automation of test data generation or validation is a plus.
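
To make the source-to-target reconciliation idea concrete, here is a self-contained sketch that uses SQLite purely so it runs anywhere; in real ETL testing the same full-outer-join pattern would be executed against the warehouse. The tables, columns, and sample rows are invented for illustration.

```python
# Self-contained illustration of source-to-target reconciliation; all tables
# and data are made up, and SQLite stands in for the actual warehouse.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.0);  -- one bad row, one missing
""")

# Full outer comparison: rows missing on either side or with mismatched values.
# Note: SQLite supports FULL OUTER JOIN from version 3.39; older versions
# would need a LEFT JOIN / UNION workaround.
mismatches = con.execute("""
SELECT s.order_id, s.amount AS src_amount, t.amount AS tgt_amount
FROM src_orders s
FULL OUTER JOIN tgt_orders t ON s.order_id = t.order_id
WHERE s.order_id IS NULL OR t.order_id IS NULL OR s.amount <> t.amount
""").fetchall()

for row in mismatches:
    print("mismatch:", row)  # expected: (2, 20.0, 25.0) and (3, 30.0, None)
```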

Posted 1 month ago
