
921 Sqoop Jobs - Page 22

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

The Applications Development Senior Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities: Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development and model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas. Monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users. Utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgements. Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Consult with users, clients, and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems. Ensure essential procedures are followed and help define operating standards and processes. Serve as advisor or coach to new or lower-level analysts. Operate with a limited level of direct supervision, exercising independence of judgement and autonomy. Act as SME to senior stakeholders and/or other team members. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications: 8-12 years of relevant experience. Experience in systems analysis and programming of software applications. Experience in managing and implementing successful projects. Working knowledge of consulting/project management techniques and methods. Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Education: Bachelor's degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Technical requirements: 8 to 12 years of application development experience through the full lifecycle. SME for UI architecture patterns such as Micro Frontend and NX. Core Java/J2EE application experience with complete command of OOP and design patterns. Strong in data structures and algorithms, with experience in core application development at complex scale covering all areas of Java/J2EE. Thorough knowledge of, and hands-on experience with, the following technologies: Hadoop, MapReduce framework, Spark, YARN, Sqoop, Pig, Hue, Unix, Java, Impala, and Cassandra on Mesos. Cloudera certification (CCDH) is an added advantage.
The candidate should have implemented, or been part of, complex project execution in the Big Data Spark ecosystem, processing large volumes of data, with a thorough understanding of distributed processing and integrated applications. Exposure to ETL and BI tools is a plus. Work in an agile environment following Scrum best practices. Expertise in designing and optimizing software solutions for performance and stability. Expertise in troubleshooting and problem solving. Expertise in test-driven development.

Job Family Group: Technology. Job Family: Applications Development. Time Type: Full time. Most Relevant Skills: Please see the requirements listed above. Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
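For context on the Sqoop-centred ingestion work this listing describes, the snippet below is a minimal, hypothetical sketch of launching a Sqoop table import from Python; the connection string, credentials file, table name, and target directory are illustrative assumptions, not details from the posting.

```python
import subprocess

# Hypothetical Sqoop import: pull a relational table into HDFS as Parquet.
# Connection details, table name, and paths are placeholders for illustration only.
sqoop_import = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://dbhost:3306/sales",
    "--username", "etl_user",
    "--password-file", "/user/etl/.db_password",  # keeps the password off the command line
    "--table", "orders",
    "--target-dir", "/data/raw/orders",
    "--num-mappers", "4",                         # parallel map tasks for the import
    "--as-parquetfile",
]

subprocess.run(sqoop_import, check=True)          # raises if the Sqoop job fails
```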

Posted 1 month ago

Apply

1.0 - 4.0 years

6 - 9 Lacs

Pune

Work from Office

Join us as a Pyspark Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Pyspark Engineer you should have experience with Pyspark, AWS, Snowflake, and data warehouse technologies. Other highly valued skills may include DevOps tools, Airflow, Iceberg, and Agile methodologies. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Assistant Vice President Expectations: Advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. For an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices in other areas, teams and companies, to solve problems creatively and effectively. Communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
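As a rough illustration of the Airflow and PySpark skills listed above, here is a minimal sketch of a daily Airflow 2.x DAG wrapping a load step; the DAG id, task name, and callable are hypothetical and not taken from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_staging():
    # Placeholder for a PySpark/Snowflake load step; real logic would go here.
    print("loading daily extract into the staging area")


# Hypothetical DAG: one daily task, no backfill of historical runs.
with DAG(
    dag_id="daily_staging_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_to_staging", python_callable=load_to_staging)
```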

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Experience: 3+ years. As a Senior Data Engineer, you'll build robust data pipelines and enable data-driven decisions by developing scalable solutions for analytics and reporting. Perfect for someone with strong database and ETL expertise.

Job Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes. Work with large data sets from diverse sources. Develop and optimize data models, warehouses, and integrations. Collaborate with data scientists, analysts, and product teams. Ensure data quality, security, and compliance standards.

Qualifications: Proficiency in SQL, Python, and data pipeline tools (Airflow, Spark). Experience with data warehouses (Redshift, Snowflake, BigQuery). Knowledge of cloud platforms (AWS/GCP/Azure). Strong problem-solving and analytical skills.

Posted 1 month ago

Apply

6.0 - 11.0 years

9 - 13 Lacs

Ahmedabad

Work from Office

Artic Consulting is looking for a Data Engineer - Microsoft Fabric Focus to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing and supply chain, and for managing the manufacturing data. Grade-specific focus: Digital Continuity and Manufacturing. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities: The developer leads cloud application development and deployment. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Preferred Education: Master's Degree.

Required Technical and Professional Expertise: Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns. Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices. Primary skills: Core Java, Spring Boot, Java2/EE, microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark. Good to have: Python. Strong knowledge of microservice logging, monitoring, debugging, and testing. In-depth knowledge of relational databases (e.g., MySQL). Experience with container platforms such as Docker and Kubernetes, and messaging platforms such as Kafka or IBM MQ. Good understanding of test-driven development. Familiarity with Ant, Maven, or other build automation frameworks. Good knowledge of basic UNIX commands. Experience in concurrent design and multi-threading.

Preferred Technical and Professional Experience: None.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities: Lead and mentor a team of Python developers. Design, develop, and maintain highly scalable data processing applications. Write efficient, reusable, and well-documented code. Deliver big data projects using Spark, Scala, Python, SQL, HQL, and Hive. Leverage data pipelining applications to package work. Maintain and tune existing Hadoop applications. Work closely with QA, Operations, and various teams to deliver error-free software on time. Perform code reviews and provide constructive feedback. Actively participate in daily agile/scrum meetings.

Requirements: 7+ years of software development experience with Hadoop framework components (HDFS, Spark, PySpark, Sqoop, Hive, HQL, Scala). Experience in a leadership or supervisory role. 4+ years of experience using Python, SQL, and shell scripting. Experience in developing and tuning Spark applications. Excellent understanding of Spark architecture, data frames, and Spark tuning. Strong knowledge of database concepts, systems architecture, and data structures is a must. Process-oriented with strong analytical and problem-solving skills. Excellent written and verbal communication skills. Bachelor's degree in Computer Science or a related field.

This job was posted by Manisha Rani from Amantya Technologies.
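To illustrate the Spark and Hive delivery work described above, the sketch below shows a small PySpark job that reads a Hive table, aggregates it, and writes the result back to Hive; the database and table names (raw_db.orders, mart_db.daily_orders) are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive-enabled Spark session; assumes the cluster exposes a Hive metastore.
spark = (SparkSession.builder
         .appName("daily_orders_rollup")
         .enableHiveSupport()
         .getOrCreate())

orders = spark.table("raw_db.orders")  # hypothetical source table

daily = (orders
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("customer_id").alias("unique_customers")))

# Overwrite the downstream mart table on each run (hypothetical target).
daily.write.mode("overwrite").saveAsTable("mart_db.daily_orders")
```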

Posted 1 month ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Profile: Sr. DW BI Developer. Location: Sector 64, Noida (Work from Office).

Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will develop the ERP system, provide comprehensive day-to-day support and training, and develop the current ERP system for the future.

Key Responsibilities: As a Sr. DW BI Developer, the candidate will participate in the design, development, customization, and maintenance of software applications. The candidate should analyse the different applications/products and design and implement the data warehouse using best practices. Rich data governance experience: data security, data quality, provenance/lineage. The candidate will also maintain a close working relationship with the other application stakeholders. Experience developing secured and high-performance web applications. Knowledge of software development lifecycle methodologies, e.g. Iterative, Waterfall, Agile, etc. Designing and architecting future releases of the platform. Participating in troubleshooting application issues. Jointly working with other teams and partners handling different aspects of the platform creation. Tracking advancements in software development technologies and applying them judiciously in the solution roadmap. Ensuring all quality controls and processes are adhered to. Planning the major and minor releases of the solution. Ensuring robust configuration management. Working closely with the Engineering Manager on different aspects of product lifecycle management. Ability to independently work in a fast-paced environment requiring multitasking and efficient time management.

Required Skills and Qualifications: End-to-end lifecycle of data warehousing, data lakes, and reporting. Experience maintaining/managing data warehouses. Responsible for the design and development of large, scaled-out, real-time, high-performing Data Lake / Data Warehouse systems (including big data and cloud). Strong SQL and analytical skills. Experience in Power BI, Tableau, QlikView, Qlik Sense, etc. Experience in Microsoft Azure services. Experience in developing and supporting ADF pipelines. Experience in Azure SQL Server / Databricks / Azure Analysis Services. Experience in developing tabular models. Experience in working with APIs. Minimum 2 years of experience in a similar role. Experience with data warehousing and data modelling. Strong experience in SQL. 2-6 years of total experience in building DW/BI systems. Experience with ETL and working with large-scale datasets. Proficiency in writing and debugging complex SQLs. Prior experience working with global clients. Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop. Storage such as HDFS, object storage (S3, etc.), RDBMS, MPP, and NoSQL databases. Experience with distributed data management and data failover, including databases (relational, NoSQL, big data), data analysis, data processing, data transformation, high availability, and scalability. Experience in end-to-end project implementation in the cloud (Azure / AWS / GCP) as a DW BI Developer. Rich data governance experience: data security, data quality, provenance/lineage. Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.
Prior experience of working in cloud like Azure, AWS and GCP Prior experience of working with Global Clients To know our Privacy Policy, please click on the link below or copy paste the URL on your browser: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf

Posted 1 month ago

Apply

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Responsibilities: 5 to 12 years of experience. Design, develop, and deploy high-quality data processing applications and pipelines. Analyze and optimize existing data workflows, pipelines, and data integration processes. Develop highly scalable, testable, and maintainable code for data transformation and storage. Troubleshoot and resolve data-related issues and performance bottlenecks. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Qualifications: Bachelor's degree or equivalent experience in Computer Science, Information Technology, or a related field. Hands-on development experience with Python and Apache Spark. Strong knowledge of Big Data technologies such as Hadoop, HDFS, Hive, Sqoop, Kafka, and RabbitMQ. Proficiency in working with SQL databases or relational database systems (SQL Server, Oracle). Familiarity with NoSQL databases like MongoDB, Cassandra, or HBase. Experience with cloud platforms (AWS, Azure Databricks, GCP) is a plus. Understanding of ETL techniques, data integration, and Agile methodologies.

Posted 1 month ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office

Understanding the requirements and developing ADF pipelines. Good knowledge of Databricks. Strong understanding of the existing ADF pipelines and enhancements. Deployment and monitoring of ADF jobs. Good understanding of SQL concepts and strong SQL query writing. Understanding and writing stored procedures. Performance tuning.

Roles and Responsibilities: Understand business and data integration requirements. Design, develop, and implement scalable and reusable ADF pipelines for ETL/ELT processes. Leverage Databricks for advanced data transformations within ADF pipelines. Collaborate with data engineers to integrate ADF with Azure Databricks notebooks for big data processing. Analyze and understand existing ADF workflows. Implement improvements, optimize data flows, and incorporate new features based on evolving requirements. Manage deployment of ADF solutions across development, staging, and production environments. Set up monitoring, logging, and alerts to ensure smooth pipeline executions and troubleshoot failures. Write efficient and complex SQL queries to support data analysis and ETL tasks. Tune SQL queries for performance, especially in large-volume data scenarios. Design, develop, and maintain stored procedures for data transformation and business logic. Ensure procedures are optimized and modular for reusability and performance. Identify performance bottlenecks in queries and data processing routines. Apply indexing strategies, query refactoring, and execution plan analysis to enhance performance.

Posted 1 month ago

Apply

4.0 - 6.0 years

10 - 20 Lacs

Noida

Hybrid

Designation: Senior Software Engineer / Software Engineer - Data Engineering. Location: Noida. Experience: 4-6 years.

Job Summary / Your Role in a Nutshell: The ideal candidate is a skilled Data Engineer proficient in Python, Scala, or Java with a strong background in Hadoop, Spark, SQL, and various data platforms, and with expertise in optimizing the performance of data applications and contributing to rapid and agile development processes.

What you'll do: Review and understand business requirements, ensuring that development tasks are completed within the timeline provided and that issues are fully tested with minimal defects. Partner with a software development team to implement best practices and optimize the performance of data applications to ensure that client needs are met at all times. Collaborate across the company and interact with our customers to understand, translate, define, and design their business challenges and concerns into innovative solutions. Research new Big Data technologies, assessing maturity and alignment of technology to business and technology strategy. Work in a rapid and agile development process to enable increased speed to market while maintaining appropriate controls.

What you need: BE/B.Tech/MCA with at least 4+ years of experience in design and development using the Data Engineering technology stack and programming languages. Mandatory experience in the following areas: Python/Scala/Java; Hadoop, HDFS, MR; Spark SQL, DataFrames, RDDs; SQL; Hive / Snowflake / SQL Server / BigQuery; Elasticsearch. Preferred experience in 3 or more of the following areas (a streaming sketch follows this listing): Spark Streaming, Spark ML; Kafka/Flume; Apache NiFi; Apache Airflow/Oozie; cloud-based data platforms; NoSQL databases (HBase/Cassandra/Neo4j/MongoDB). Good knowledge of the current technology landscape and ability to visualize industry trends. Working knowledge of Big Data integration with third-party or in-house-built Metadata Management, Data Quality, and Master Data Management solutions. Active community involvement through articles, blogs, or speaking engagements at conferences.
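For the Spark Streaming and Kafka items in the preferred-skills list above, here is a minimal Structured Streaming sketch; the broker address, topic, and output paths are assumptions, and the job also assumes the spark-sql-kafka connector package is available on the cluster.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

# Hypothetical Kafka source: broker and topic names are placeholders.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka delivers bytes; cast the payload to a string column before landing it.
payload = events.selectExpr("CAST(value AS STRING) AS json_payload")

query = (payload.writeStream
         .format("parquet")
         .option("path", "/data/streams/clickstream")             # hypothetical landing path
         .option("checkpointLocation", "/checkpoints/clickstream")
         .start())

query.awaitTermination()
```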

Posted 1 month ago

Apply

1.0 - 5.0 years

1 - 4 Lacs

Nashik, Manmad

Work from Office

We are looking for a highly skilled and experienced Legal Officer to join our team at Equitas Small Finance Bank. Roles and Responsibility Manage and oversee legal matters related to mortgages and other financial products. Draft and review contracts, agreements, and other legal documents. Provide legal advice and guidance on various banking-related matters. Conduct legal research and analysis to support business decisions. Collaborate with cross-functional teams to ensure compliance with regulatory requirements. Develop and implement strategies to mitigate legal risks and minimize losses. Job Requirements Strong knowledge of legal principles and practices applicable to the BFSI industry. Experience working with SBL or similar institutions is preferred. Excellent analytical, communication, and problem-solving skills. Ability to work independently and as part of a team. Strong attention to detail and organizational skills. Familiarity with mortgage laws and regulations is desirable.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hubli

Work from Office

We are looking for a skilled Regional Receivables Manager to join our team at Equitas Small Finance Bank. Roles and Responsibility Manage and oversee regional receivables operations to ensure timely recovery of outstanding amounts. Develop and implement strategies to improve collection efficiency and reduce delinquencies. Collaborate with internal stakeholders to resolve customer complaints and disputes. Analyze and report on key performance indicators to identify areas for improvement. Ensure compliance with regulatory requirements and company policies. Lead and motivate a team of collection professionals to achieve targets. Job Requirements Strong knowledge of Inclusive Banking, SBL, and Mortgages concepts. Excellent communication and interpersonal skills. Ability to work in a fast-paced environment and meet deadlines. Strong analytical and problem-solving skills. Experience in managing teams and leading by example. Familiarity with financial software and systems is desirable.

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Coimbatore, Erode, Gandhi

Work from Office

We are looking for a highly skilled and experienced Relationship Manager to join our team at Equitas Small Finance Bank. Roles and Responsibility Manage and maintain strong relationships with existing clients to increase business growth. Identify new business opportunities and develop strategies to expand the client base. Provide excellent customer service and support to ensure high levels of client satisfaction. Collaborate with internal teams to achieve sales targets and improve overall performance. Develop and implement effective relationship management plans to drive business results. Analyze market trends and competitor activity to stay ahead in the industry. Job Requirements Strong knowledge of Assets, Inclusive Banking, SBL, Mortgages, Standalone Merchant OD, and Relationship Management. Excellent communication and interpersonal skills are required to build strong relationships with clients and colleagues. Ability to work in a fast-paced environment and meet sales targets. Strong analytical and problem-solving skills to analyze market trends and competitor activity. Experience working in the BFSI industry with a focus on relationship management and business growth. Ability to work collaboratively as part of a team to achieve business objectives.

Posted 1 month ago

Apply

1.0 - 5.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Diverse Lynx is looking for a Snowflake Developer to join our dynamic team and embark on a rewarding career journey. A Snowflake Developer is responsible for designing and developing data solutions within the Snowflake cloud data platform. They play a critical role in helping organizations store, process, and analyze their data effectively and efficiently.

Responsibilities: Design and develop data solutions within the Snowflake cloud data platform, including data warehousing, data lake, and data modeling solutions. Participate in the design and implementation of data migration strategies. Ensure the quality of custom solutions through the implementation of appropriate testing and debugging procedures. Provide technical support and troubleshoot issues as needed. Stay up-to-date with the latest developments in the Snowflake platform and data warehousing technologies. Contribute to the ongoing improvement of development processes and best practices.

Requirements: Experience in data warehousing and data analytics. Strong knowledge of SQL and data warehousing concepts. Experience with Snowflake or other cloud data platforms is preferred. Ability to analyze and interpret data. Excellent written and verbal communication skills. Ability to work independently and as part of a team. Strong attention to detail and ability to work in a fast-paced environment.
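As a small illustration of the Snowflake development work described above, the sketch below uses the snowflake-connector-python package to run a query; the account, credentials, warehouse, and table names are hypothetical placeholders.

```python
import snowflake.connector

# Hypothetical connection parameters; in practice these come from a secrets store.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ANALYTICS_DEV",
    password="***",               # placeholder only; never hard-code real credentials
    warehouse="DEV_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Hypothetical aggregate over an assumed ORDERS table.
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```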

Posted 1 month ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Hubli

Work from Office

We are looking for a highly skilled and experienced Legal Officer to join our team at Equitas Small Finance Bank. Roles and Responsibility Manage and oversee legal matters related to mortgages and other financial products. Provide legal support and guidance to internal stakeholders on various banking operations. Conduct legal research and analysis to ensure compliance with regulatory requirements. Develop and implement effective legal strategies to mitigate risks and protect the bank's interests. Collaborate with cross-functional teams to achieve business objectives. Ensure all legal documents and contracts are properly executed and stored. Job Requirements Strong knowledge of legal principles and practices applicable to the BFSI industry. Experience working with SBL or similar institutions is preferred. Excellent analytical and problem-solving skills with attention to detail. Ability to work independently and as part of a team. Strong communication and interpersonal skills. Familiarity with mortgage laws and regulations is essential.

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

Chennai

Work from Office

Overview: We are looking for a highly skilled Lead Engineer to spearhead our data and application migration projects. The ideal candidate will have in-depth knowledge of cloud migration strategies, especially with AWS, and hands-on experience in large-scale migration initiatives. This role requires strong leadership abilities, technical expertise, and a keen understanding of both the source and target platforms.

Responsibilities: Lead end-to-end migration projects, including planning, design, testing, and implementation. Collaborate with stakeholders to define migration requirements and goals. Perform assessments of existing environments to identify the scope and complexity of migration tasks. Design and architect scalable migration strategies, ensuring minimal downtime and business continuity. Oversee the migration of on-premises applications, databases, and data warehouses to cloud infrastructure. Ensure the security, performance, and reliability of migrated workloads. Provide technical leadership and guidance to the migration team, ensuring adherence to best practices. Troubleshoot and resolve any technical challenges related to the migration process. Collaborate with cross-functional teams, including infrastructure, development, and security. Document migration procedures and lessons learned for future reference.

Posted 1 month ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Hyderabad

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Develop/convert the database (Hadoop to GCP) of the specific objects (tables, views, procedures, functions, triggers, etc.) from one database platform to another. Implementation of a specific data replication mechanism (CDC, file data transfer, bulk data transfer, etc.). Expose data as APIs. Participation in the modernization roadmap journey. Analyze discovery and analysis outcomes. Lead discovery and analysis workshops/playbacks. Identification of application dependencies and source and target database incompatibilities. Analyze the non-functional requirements (security, HA, RTO/RPO, storage, compute, network, performance bench, etc.). Prepare the effort estimates, WBS, staffing plan, RACI, RAID, etc. Lead the team to adopt the right tools for various migration and modernization methods.

Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
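The Hadoop-to-GCP conversion work described above typically involves validating that migrated objects return the expected results; the sketch below, assuming the google-cloud-bigquery client library and hypothetical project, dataset, and table names, shows a minimal row-count check against a migrated table.

```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names used purely for illustration.
client = bigquery.Client(project="analytics-migration")

query = """
    SELECT COUNT(*) AS row_count
    FROM `analytics-migration.sales_dw.orders`
"""
result = client.query(query).result()  # waits for the query job to finish

for row in result:
    print(f"Migrated orders row count: {row.row_count}")
```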

Posted 1 month ago

Apply

6.0 - 7.0 years

14 - 17 Lacs

Hyderabad

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework using Python or Scala and Big Data technologies for various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / Azure ecosystem components to implement scalable solutions to meet the ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Total 6-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering skills. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark / Python or Scala. Minimum 3 years of experience on Cloud Data Platforms on Azure. Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB. Good to excellent SQL skills.

Preferred technical and professional experience: Certification in Azure and Databricks or Cloudera Spark certified developers.

Posted 1 month ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications and provide regular support and guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for Company products and Platform and customer-facing systems.

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines and workflows for source to target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Must have 5+ years' experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (just like a rules engine). Developed Python code to gather data from HBase and design the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations.

Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.

Posted 1 month ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Hyderabad

Work from Office

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications and provide regular support and guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for Company products and Platform and customer-facing systems.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 4+ years of experience in data modeling and data architecture. Proficiency in data modeling tools such as Erwin and IBM InfoSphere Data Architect, and in database management systems. Familiarity with different data models such as relational, dimensional and NoSQL databases. Understanding of business processes and how data supports business decision making. Strong understanding of database design principles, data warehousing concepts, and data governance practices.

Preferred technical and professional experience: Excellent analytical and problem-solving skills with a keen attention to detail. Ability to work collaboratively in a team environment and manage multiple projects simultaneously. Knowledge of programming languages such as SQL.

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines and workflows for source to target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Must have 5+ years' experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (just like a rules engine). Developed Python code to gather data from HBase and design the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations.

Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.

Posted 1 month ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions. Write efficient, complex SQL queries for data extraction, transformation, and loading. Utilize DBT for data modeling and transformation. Use Python for data engineering tasks, demonstrating strong work experience in this area. Implement scheduling tools like Airflow, Control-M, or shell scripting to automate data processes and workflows. Participate in an Agile environment, adapting quickly to changing priorities and requirements.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Mandatory skills: The candidate should have worked on traditional data warehousing with any database (Oracle, DB2 or SQL Server; Redshift optional). The candidate should have strong SQL skills and the ability to write complex queries using analytical functions. Prior working experience on the AWS platform. Python programming experience for data engineering. Experience in PySpark/Spark. Working knowledge of the data pipeline tool Airflow. Nice-to-have skills: Experience with DBT. Exposure to working in an Agile environment. Proven ability to troubleshoot and resolve production issues under a DevOps model. A track record of continuously identifying opportunities to improve the performance and quality of your ecosystem. Experience monitoring performance.

Preferred technical and professional experience: Knowledge of DBT for data modeling and transformation is a plus. Experience with PySpark or Spark is highly desirable.

Posted 1 month ago

Apply