
916 Sqoop Jobs - Page 10

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of this role is to provide significant technical expertise in the architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as to manage its day-to-day operations.

Do
Provide adequate support in architecture planning, migration and installation for new projects in own tower (platform/database/middleware/backup). Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution. Conduct technology capacity planning by reviewing current and future requirements. Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable. Strategize and implement disaster recovery plans, and create and implement backup and recovery plans.

Manage the day-to-day operations of the tower. Manage day-to-day operations by troubleshooting any issues, conducting root cause analysis (RCA), and developing fixes to avoid similar issues. Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower. Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges. Develop a shift roster for the team to ensure no disruption in the tower. Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc. Provide weekly status reports to the client leadership team and internal stakeholders on database activities with respect to progress, updates, status and next steps. Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness.

Team Management
Resourcing: Forecast talent requirements as per current and future business needs. Hire adequate and right resources for the team. Train direct reportees to make the right recruitment and selection decisions.
Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness. Build an internal talent pool of HiPos and ensure their career progression within the organization. Promote diversity in leadership positions.
Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, both for themselves and for the levels below them.
Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team. Track team satisfaction scores and identify initiatives to build engagement within the team. Proactively challenge the team with larger and enriching projects/initiatives for the organization or team. Exercise employee recognition and appreciation.

Mandatory Skills: Big Data Consulting. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Required Skills & Experience: 7+ years of QA experience, with 5+ years in Facets US Healthcare experience, especially in Claims Processing, Membership, Providers and Utilization Management. Strong experience with Facets application testing (UI, DB and Batch). Hands-on experience with Facets modernization projects. Familiarity with tools like JIRA or Rally. Excellent communication and leadership skills. Preferred Skills: Experience with Apache Airflow (nice to have). Knowledge of cloud platforms and migration strategies. Certification in QA methodologies or Agile practices. Mandatory Skills: HC - Payor. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

13 - 18 Lacs

Chandigarh

Work from Office

Job Description: Full-stack Architect. Experience: 8-10 years. Architect, design, and oversee the development of full-stack applications using modern JS frameworks and cloud-native tools. Lead microservice architecture design, ensuring system scalability, reliability, and performance. Evaluate and implement AWS services (Lambda, ECS, Glue, Aurora, API Gateway, etc.) for backend solutions. Provide technical leadership to engineering teams across all layers (frontend, backend, database). Guide and review code, perform performance optimization, and define coding standards. Collaborate with DevOps and Data teams to integrate services (Redshift, OpenSearch, Batch). Translate business needs into technical solutions and communicate with cross-functional stakeholders.
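The AWS services named above (Lambda behind API Gateway in particular) can be pictured with a minimal sketch. The handler below is hypothetical and not part of the posting; the route, payload shape, and resource names are assumptions.

```python
# Hypothetical sketch: an AWS Lambda handler for an API Gateway proxy integration.
# Route and field names are invented for illustration only.
import json

def handler(event, context):
    """Return an order summary for an assumed GET /orders/{order_id} route."""
    order_id = (event.get("pathParameters") or {}).get("order_id", "unknown")

    # In a real service this would query Aurora or another data store;
    # here we return a canned response to keep the sketch self-contained.
    body = {"order_id": order_id, "status": "PROCESSING"}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```

In practice such a handler sits behind an API Gateway proxy integration and calls a managed store rather than returning canned data.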

Posted 3 weeks ago

Apply

4.0 - 6.0 years

8 - 9 Lacs

Pune

Work from Office

Diverse Lynx is looking for an Azure Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Chennai

Work from Office

Project description: We need a Senior Python and PySpark Developer to work for a leading investment bank client.

Responsibilities: Develop software applications based on business requirements. Maintain software applications and make enhancements according to project specifications. Participate in requirement analysis, design, development, testing, and implementation activities. Propose new techniques and technologies for software development. Perform unit testing and user acceptance testing to evaluate application functionality. Complete the assigned development tasks within the deadlines. Work in compliance with coding standards and best practices. Provide assistance to junior developers when needed. Perform code reviews and recommend improvements. Review business requirements and recommend changes to develop reliable applications. Develop coding documentation and other technical specifications for assigned projects. Act as primary contact for development queries and concerns. Analyze and resolve development issues accurately.

Skills - Must have: 8+ years of experience in data-intensive PySpark development. Experience as a core Python developer. Experience developing classes, OOP, exception handling, and parallel processing. Strong knowledge of DB connectivity, data loading, transformation, and calculation. Extensive experience with Pandas/NumPy dataframes, slicing, data wrangling, and aggregations. Lambda functions and decorators. Vector operations on Pandas dataframes/series. Application of applymap, apply, and map functions. Concurrency and error handling in data pipeline batches of size 1-10 GB. Ability to understand business requirements and translate them into technical requirements. Ability to design the architecture of a data pipeline for concurrent data processing. Familiarity with creating/designing RESTful services and APIs. Familiarity with application unit tests. Working with Git source control. Service-oriented architecture, including the ability to consider integrations with other applications and services. Debugging applications.

Nice to have: Knowledge of web backend technology (Django, Python, PostgreSQL). Apache Airflow. Atlassian Jira. Understanding of financial markets asset classes (FX, FI, Equities, Rates, Commodities & Credit), various trade types (OTC, exchange-traded, Spot, Forward, Swap, Options) and related systems is a plus. Surveillance domain knowledge, regulations (MAR, MiFID, CAT, Dodd-Frank) and related systems knowledge is certainly a plus.
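For orientation, the Pandas idioms this posting lists (apply, map, applymap-style element-wise operations, lambda functions, and decorators) look roughly like the sketch below. It is a generic illustration with invented column names, not code from the client project.

```python
# Minimal sketch of the Pandas idioms named in the posting; data is invented.
import time
import pandas as pd

def timed(fn):
    """Decorator that reports how long a transformation takes."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} took {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

@timed
def enrich(trades: pd.DataFrame) -> pd.DataFrame:
    out = trades.copy()
    # Element-wise vector operation on Series.
    out["notional"] = out["qty"] * out["price"]
    # Series.map with a lambda for a per-row label.
    out["size_bucket"] = out["notional"].map(lambda n: "large" if n > 1_000 else "small")
    # Row-wise apply across columns.
    out["desc"] = out.apply(lambda r: f"{r['symbol']}:{r['qty']}@{r['price']}", axis=1)
    # DataFrame.applymap (renamed DataFrame.map in recent pandas) over every cell.
    out[["qty", "price"]] = out[["qty", "price"]].applymap(lambda v: round(float(v), 4))
    return out

if __name__ == "__main__":
    df = pd.DataFrame({"symbol": ["EURUSD", "USDJPY"],
                       "qty": [1500, 20],
                       "price": [1.0873, 151.2]})
    print(enrich(df))
```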

Posted 3 weeks ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Gurugram

Work from Office

Project description: We are looking for an experienced Data Engineer to contribute to the design, development, and maintenance of our database systems. This role will work closely with our software development and IT teams to ensure the effective implementation and management of database solutions that align with the client's business objectives.

Responsibilities: The successful candidate will be responsible for managing technology in projects and providing technical guidance/solutions for work completion: (1) providing technical guidance/solutions; (2) ensuring process compliance in the assigned module and participating in technical discussions/reviews; (3) preparing and submitting status reports to minimize exposure and risks on the project or closure of escalations; (4) being self-organized and focused on delivering quality software on time.

Skills - Must have: At least 7 years of experience in development on data-specific projects. Working knowledge of streaming data and the Kafka framework (kSQL, MirrorMaker, etc.). Strong programming skills in at least one of these programming languages: Groovy/Java. Good knowledge of data structures, ETL design, and storage. Must have worked in streaming-data environments and pipelines. Experience in near real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi, or similar frameworks. Nice to have: N/A.
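As a rough illustration of the near real-time pipelines described above, the following PySpark Structured Streaming sketch reads a Kafka topic and lands the parsed events as Parquet. The broker, topic, schema, and paths are assumptions, and the spark-sql-kafka connector package must be on the Spark classpath.

```python
# Hedged sketch of a Kafka -> Spark Structured Streaming pipeline; names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
       .option("subscribe", "payments")                     # assumed topic
       .load())

# Kafka delivers bytes; cast the value and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/streams/payments")            # assumed sink path
         .option("checkpointLocation", "/data/checkpoints/payments")
         .outputMode("append")
         .start())

query.awaitTermination()
```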

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: PySpark. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: PySpark. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

6 - 10 Lacs

Pune

Work from Office

Job Title: Pentaho Developer. Corporate Title: Associate. Location: Pune, India.

Role Description: Developer with at least 10 years of experience. Experience in developing and deploying Pentaho-based applications. Need to work on a Data Integration project, mostly batch-oriented, using Pentaho 9.3, Oracle, Hadoop, Python, PySpark and similar technologies. Primary Skills: Pentaho, SQL, Oracle, Python, PySpark, shell scripting. Experience: Minimum of 10 years of hands-on experience in development projects.

What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities: Hands-on developer. Pentaho jobs/transformations development role. Participating in agile development projects for batch data ingestion. Fast learner, in order to understand the current data landscape and the existing Spark and Hive HQL programs to make enhancements. Bug fixing and enhancements for the existing applications. Software upgrades and maintenance, mandatory client connectivity upgrades. Closure of audit and operating control issues. Migration of out-of-support application software. Migration of applications to the cloud. Migration of applications from one technology to another.

Your skills and experience: Pentaho, SQL, Oracle, Python, PySpark, shell scripting. Basic knowledge of Unix shell scripting is a must. Good at writing database SQL to process the batch jobs; analytical SQL. Pentaho Big Data components. Hadoop - Hive and Spark experience/knowledge. Know-how with cloud-based infrastructure. Expertise in unit testing.

How we'll support you: . . .

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Pune

Work from Office

JC-67103. Band: B2, B3. Location: Chennai, Coimbatore, Bangalore, Pune. Key skills: Azure Data Factory (primary), Azure Databricks Spark (PySpark), SQL. Experience: 7 to 10 years.

Must-have skills: Cloud certified in one of these categories - Azure Data Engineer. Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion, curation. Semantic modelling/optimization of the data model to work within Rahona. Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle. Experience in Sqoop/Hadoop. Microsoft Excel (for metadata files with requirements for ingestion). Any other certificate in Azure/AWS/GCP and hands-on data engineering experience in the cloud. Strong programming skills in at least one of Python, Scala or Java. Strong SQL skills (T-SQL or PL/SQL). Data file movement via mailbox. Source-code versioning/promotion tools, e.g. Git/Jenkins. Orchestration tools, e.g. Autosys, Oozie. Source-code versioning with Git.

Nice-to-have skills: Experience working with mainframe files. Experience in an Agile environment, JIRA/Confluence tools.

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.
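The on-prem-to-Azure ingestion this posting describes (for example, SQL Server into Databricks) might look something like the hedged sketch below; the server, credentials, table and mount paths are illustrative placeholders, not details from the role.

```python
# Illustrative sketch only: read an on-prem SQL Server table over JDBC with PySpark
# on Databricks and land it as a Delta table. All names below are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("onprem-ingest").getOrCreate()

source = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")  # assumed
          .option("dbtable", "dbo.orders")                                        # assumed
          .option("user", "svc_ingest")                                           # assumed
          .option("password", "<secret-from-key-vault>")                          # placeholder
          .option("numPartitions", 8)                # parallel JDBC reads
          .option("partitionColumn", "order_id")
          .option("lowerBound", 1)
          .option("upperBound", 1_000_000)
          .load())

# Light curation before landing in the lake.
curated = source.dropDuplicates(["order_id"]).filter("order_date IS NOT NULL")

(curated.write.format("delta")
 .mode("overwrite")
 .save("/mnt/raw/sales/orders"))   # assumed mount point
```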

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Chennai

Work from Office

The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Pune

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Datastage ETL Testing. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

16.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Principal Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [16-20 years]. Understand current and future-state enterprise architecture. Contribute to various technical streams during project implementation. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success
Architect in designing highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/big data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations. Experience in performance benchmarking of enterprise applications. Experience in data security (on the move and at rest). Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of at least one cloud platform: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience.

Ideally, you’ll also have
Strong project management skills. Client management skills. Solutioning skills. The innate quality to become the go-to person for any marketing, pre-sales and solution accelerator work within the practice.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
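As a rough sketch of the batch-mode ingestion mentioned above (processing large data sets with Hive and Spark), the following PySpark job reads a Hive table, applies a simple curation step, and writes a partitioned managed table. The database and table names are invented for illustration.

```python
# Hedged sketch of a batch Hive-to-Hive ingestion step; table names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("batch-hive-ingest")
         .enableHiveSupport()          # needed to read/write Hive metastore tables
         .getOrCreate())

policies = spark.table("raw_db.policies")          # assumed source Hive table

curated = (policies
           .filter(F.col("status").isNotNull())
           .withColumn("ingest_date", F.current_date()))

(curated.write
 .mode("overwrite")
 .partitionBy("ingest_date")
 .saveAsTable("curated_db.policies"))              # assumed target table
```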

Posted 3 weeks ago

Apply

16.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Principal Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [16-20 years]. Understand current and future-state enterprise architecture. Contribute to various technical streams during project implementation. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success
Architect in designing highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/big data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations. Experience in performance benchmarking of enterprise applications. Experience in data security (on the move and at rest). Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of at least one cloud platform: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience.

Ideally, you’ll also have
Strong project management skills. Client management skills. Solutioning skills. The innate quality to become the go-to person for any marketing, pre-sales and solution accelerator work within the practice.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

16.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Principal Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [16-20 years]. Understand current and future-state enterprise architecture. Contribute to various technical streams during project implementation. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success
Architect in designing highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/big data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations. Experience in performance benchmarking of enterprise applications. Experience in data security (on the move and at rest). Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of at least one cloud platform: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience.

Ideally, you’ll also have
Strong project management skills. Client management skills. Solutioning skills. The innate quality to become the go-to person for any marketing, pre-sales and solution accelerator work within the practice.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Big Data Developer, you will be responsible for leveraging your strong experience in big data technologies and associated tools such as Hadoop, Unix, HDFS, Hive, and Impala. Your proficiency in using Spark/Scala and experience with data import/export using Sqoop or similar tools will be crucial in this role. Additionally, you will be expected to have experience with tools like Airflow, Jenkins, or similar automation tools. Your excellent knowledge of SQL Server and database structures will be essential for writing and optimizing T-SQL queries and stored procedures. Experience working with Jira, Confluence, and GitLab will also be beneficial in this role. Your organizational skills and ability to handle multiple activities with changing priorities simultaneously will be highly valued.

As part of the delivery team, your primary responsibilities will include ensuring effective design, development, validation, and support activities to meet client satisfaction in the technology domain. You will gather requirements, understand client needs, and translate them into system requirements. Additionally, you will play a key role in estimating work requirements and providing project estimations to Technology Leads and Project Managers. You will be a key contributor to building efficient programs/systems and collaborating with other Big Data developers to ensure consistency in data solutions. Your ability to partner with the business community, perform technology research, and evaluate new technologies will be crucial in enhancing the overall capability of the analytics technology stack.

Key Responsibilities:
- Code, test, and document new or modified data systems to create robust and scalable applications for data analytics.
- Work with other Big Data developers to ensure consistency in data solutions.
- Partner with the business community to understand requirements, determine training needs, and deliver user training sessions.
- Perform technology and product research to define requirements, resolve issues, and enhance the analytics technology stack.
- Evaluate and provide feedback on future technologies and new releases/upgrades.

Job Specific Knowledge:
- Support Big Data and batch/real-time analytical solutions using transformational technologies.
- Work on multiple projects as a technical team member or drive user requirement analysis, design, development, testing, and automation tools.

Professional Attributes:
- Good communication skills.
- Team player willing to collaborate throughout all phases of development, testing, and deployment.
- Ability to solve problems and meet deadlines with minimal supervision.

If you believe you have the skills and experience to contribute effectively to our clients' digital transformation journey, we welcome you to join our team.
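Since the role pairs Sqoop-style import/export with orchestration tools such as Airflow, here is a hedged sketch of how the two are commonly combined: an Airflow DAG whose BashOperator invokes the sqoop CLI. The connection string, credentials file, table, and target directory are assumptions, not details from the posting.

```python
# Hedged sketch: schedule a daily Sqoop import from an Airflow DAG (Airflow 2.x style).
# All connection details and paths below are illustrative assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sqoop_import",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    import_orders = BashOperator(
        task_id="sqoop_import_orders",
        bash_command=(
            "sqoop import "
            "--connect jdbc:mysql://dbhost:3306/sales "   # assumed source database
            "--username etl_user --password-file /user/etl/.pw "
            "--table orders "
            "--target-dir /data/raw/orders/{{ ds }} "     # partition HDFS dir by run date
            "--num-mappers 4"
        ),
    )
```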

Posted 3 weeks ago

Apply

15.0 - 20.0 years

3 - 6 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Science Practitioner
Project Role Description: Formulate, design and deliver AI/ML-based decision-making frameworks and models for business outcomes. Measure and justify the value of AI/ML-based solutions.
Must have skills: Natural Language Processing (NLP)
Good to have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Science Practitioner, you will be engaged in formulating, designing, and delivering AI and machine learning-based decision-making frameworks and models that drive business outcomes. Your typical day will involve collaborating with various teams to measure and justify the value of AI and machine learning solutions, ensuring that they align with organizational goals and deliver tangible results. You will also be responsible for analyzing complex data sets, deriving insights, and presenting findings to stakeholders to support informed decision-making.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and training sessions to enhance team capabilities in AI and machine learning.
- Continuously evaluate and improve existing AI and machine learning models to ensure optimal performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Natural Language Processing (NLP).
- Strong understanding of machine learning algorithms and their applications.
- Experience with data analysis and visualization tools.
- Ability to communicate complex technical concepts to non-technical stakeholders.
- Familiarity with programming languages such as Python or R.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Natural Language Processing (NLP).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
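For context on the NLP proficiency the role asks for, a common baseline model looks like the short sketch below: a TF-IDF vectorizer feeding a logistic-regression classifier in scikit-learn. The toy texts and labels are invented; this is not code from the engagement.

```python
# Minimal, hypothetical NLP baseline: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "claim approved after review",
    "payment declined due to missing documents",
    "policy renewed successfully",
    "complaint raised about delayed settlement",
]
labels = ["positive", "negative", "positive", "negative"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),  # unigrams + bigrams
    ("model", LogisticRegression(max_iter=1000)),
])
clf.fit(texts, labels)

print(clf.predict(["settlement delayed again"]))   # expected: ['negative']
```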

Posted 3 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role
Project Role : Application Lead
Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills : AIX System Administration
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education

Your Role and Responsibilities: As an AIX Administrator, you are responsible for installation, implementation, customization, operation, recovery, and performance tuning, with proven knowledge of the fundamental concepts and technologies associated with the AIX operating system.

Responsibilities:
- Installation, configuration, and troubleshooting of AIX and Unix operating systems
- Clustering and high availability management
- UNIX/AIX server support: designing, troubleshooting, and storage implementation, articulating standard methodologies during implementation
- Supporting systems running on UNIX/AIX platforms, including OS clustering, partitioning, and virtualization
- Day-to-day UNIX/AIX operating system installation, migration, and break-fix support
- SAN software and storage administration and integration with operating systems
- Raising PMRs and working with the PMR team
- Installing, configuring, and maintaining IBM AIX and Unix servers
- Installation and configuration of virtualization
- Installation and configuration of cluster environments using HACMP
- Configuration and administration of Logical Volume Manager
- Patch and package administration
- Writing shell scripts to accomplish day-to-day system administration tasks
- Configuring and supporting Domains, LPARs, and DLPARs
- Administering and configuring file systems such as JFS, VxFS, and pseudo file systems
- Troubleshooting hardware and operating system related issues
- Capacity planning and fine-tuning systems for optimal performance
- Understanding of SAN and NAS storage
- Administration of NIS or LDAP environments

Required Technical and Professional Expertise:
- Minimum 7 years of experience in the IT industry in Unix administration
- AIX administration and Linux administration (Red Hat/SUSE/Ubuntu), with automation experience in OS patching and upgrades
- Ability to work independently with vendors to resolve issues (OS and hardware)
- Proven working experience in installing and configuring middleware and troubleshooting Linux/AIX based environments
- Knowledge of SAN and NAS storage
- Experience with physical, virtual, and containerized environments

Preferred Technical and Professional Expertise:
- Proactive monitoring and capacity planning experience
- Knowledge of Ansible or another automation tool is a must
- Scripting knowledge (Bash/Python/Perl) to automate day-to-day activities
- Willingness to adopt new technology
- Expertise in cluster configuration and troubleshooting
- Working knowledge of Incident and Change Management

Qualification: 15 years full time education
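The posting asks for scripting (Bash/Python/Perl) to automate routine administration. Purely as a hedged illustration, and not taken from the role, the Python sketch below flags filesystems above a usage threshold, a typical daily health check; the 80% threshold and mount points are assumptions.

```python
import shutil

# Hypothetical mount points to watch; a real AIX host would list its own.
MOUNT_POINTS = ["/", "/home", "/var", "/tmp"]
THRESHOLD = 0.80  # alert when a filesystem is more than 80% full (assumed policy)

def check_filesystems():
    """Print a warning line for every monitored filesystem above the threshold."""
    for mount in MOUNT_POINTS:
        usage = shutil.disk_usage(mount)
        used_fraction = usage.used / usage.total
        if used_fraction >= THRESHOLD:
            print(f"WARNING: {mount} is {used_fraction:.0%} full "
                  f"({usage.free // (1024 ** 2)} MiB free)")

if __name__ == "__main__":
    check_filesystems()
```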

Posted 3 weeks ago

Apply

12.0 - 15.0 years

3 - 6 Lacs

Bengaluru

Work from Office

About The Role
Project Role : Data Science Practitioner
Project Role Description : Formulate, design, and deliver AI/ML-based decision-making frameworks and models for business outcomes. Measure and justify the value of AI/ML-based solutions.
Must have skills : Natural Language Processing (NLP)
Good to have skills : NA
Minimum 12 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Science Practitioner, you will formulate, design, and deliver AI and machine learning-based decision-making frameworks and models that drive business outcomes. Your typical day will involve collaborating with various teams to measure and justify the value of AI and machine learning solutions, ensuring that they align with organizational goals and deliver tangible results. You will also analyze complex data sets, derive insights, and present findings to stakeholders, all while fostering an environment of innovation and continuous improvement.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing and mentorship within the team to enhance overall performance.
- Develop and implement best practices for data analysis and model development to ensure high-quality outputs.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Natural Language Processing (NLP).
- Strong understanding of machine learning algorithms and their applications.
- Experience with data preprocessing techniques and text analytics.
- Familiarity with programming languages such as Python or R for data analysis.
- Ability to communicate complex technical concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Natural Language Processing (NLP).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
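This listing highlights data preprocessing and text analytics in particular. As a rough, assumption-laden sketch (the document and stop-word list are invented, not from the posting), the snippet below shows the lightweight normalization and term-frequency step that usually precedes any NLP model, using only the Python standard library.

```python
import re
from collections import Counter

# Invented example document and a tiny illustrative stop-word list.
document = "The claims team reviewed the claims backlog and escalated the oldest claims."
STOP_WORDS = {"the", "and", "a", "an", "of", "to"}

def term_frequencies(text: str) -> Counter:
    """Lowercase, tokenize on word characters, drop stop words, and count terms."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tok for tok in tokens if tok not in STOP_WORDS)

print(term_frequencies(document).most_common(3))
# e.g. [('claims', 3), ('team', 1), ('reviewed', 1)]
```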

Posted 3 weeks ago

Apply

2.0 - 7.0 years

18 - 22 Lacs

Bengaluru

Work from Office

About The Role
Job Title: Data Science/Data Engineering ML9 - Sales Excellence COE - Data Engineering Specialist
Management Level: ML9
Location: Open
Must have skills: Strong GCP cloud technology experience, BigQuery, Data Science basics.
Good to have skills: Building and maintaining data models from different sources.
Experience: Minimum 5 year(s) of experience is required
Educational Qualification: Graduate/Post graduate

Job Summary: The Center of Excellence (COE) makes sure that the sales and pricing methods and offerings of Sales Excellence are effective. The COE supports salespeople through its business partners and Analytics and Sales Operations teams. The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization.

Roles & Responsibilities:
- Build and manage data models that bring together data from different sources.
- Understand the existing data model in SQL Server and help redesign/migrate it in GCP BigQuery.
- Help consolidate and cleanse data for use by the modeling and development teams.
- Structure data for use in analytics applications.
- Lead a team of Data Engineers effectively.

Professional & Technical Skills:
- A bachelor's degree or equivalent
- A minimum of 2 years of strong GCP cloud technology experience
- A minimum of 2 years of advanced SQL knowledge and experience working with relational databases
- A minimum of 2 years of familiarity and hands-on experience with SQL objects such as stored procedures, functions, and views
- A minimum of 2 years building data flow components and processing systems to extract, transform, load, and integrate data from various sources
- Basic knowledge of Data Science models and tools

Additional Information:
Extra credit if you have:
- Understanding of sales processes and systems.
- Master's degree in a technical field.
- Experience with Python.
- Experience with quality assurance processes.
- Experience in project management.

You May Also Need:
- Ability to work flexible hours according to business needs.
- A good internet connection and a distraction-free environment for working at home, in accordance with local guidelines.

About Our Company | Accenture

Qualification
Experience: Minimum 5 year(s) of experience is required
Educational Qualification: Graduate/Post graduate
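One responsibility above is migrating a SQL Server data model into BigQuery. As a loose, hypothetical sketch (the project, dataset, and table names are invented, not from the listing), the snippet below uses the google-cloud-bigquery client to write a cleansed query result into a target table, the kind of building block such a migration is assembled from.

```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names for illustration only.
client = bigquery.Client(project="sales-excellence-demo")

sql = """
    SELECT order_id, customer_id, order_total, order_date
    FROM `sales-excellence-demo.staging.orders_raw`
    WHERE order_total IS NOT NULL
"""

# Write the cleansed result into a reporting table, replacing it on each run.
job_config = bigquery.QueryJobConfig(
    destination="sales-excellence-demo.reporting.orders_clean",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.query(sql, job_config=job_config).result()  # blocks until the job finishes
print("orders_clean refreshed")
```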

Posted 3 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

About The Role
Project Role : Application Lead
Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills : PySpark
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Lead the application development team in designing and building applications.
- Act as the primary point of contact for project stakeholders.
- Provide technical guidance and mentorship to team members.
- Ensure timely delivery of high-quality software solutions.
- Collaborate with cross-functional teams to drive project success.

Professional & Technical Skills:
- Must Have Skills: Proficiency in PySpark.
- Strong understanding of big data processing and analysis.
- Experience with distributed computing frameworks.
- Hands-on experience in building scalable data pipelines.
- Knowledge of cloud platforms for data processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
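Since the must-have skill here is PySpark and the role stresses scalable data pipelines, here is a minimal, hedged sketch of the kind of batch transformation such pipelines are composed of; the file paths and column names are invented, not part of the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Hypothetical input: raw CSV drops landing in a data lake path.
orders = spark.read.csv("/data/raw/orders/", header=True, inferSchema=True)

# Typical pipeline steps: filter bad rows, derive columns, aggregate.
daily_revenue = (
    orders
    .where(F.col("order_total").isNotNull())
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(F.sum("order_total").alias("revenue"), F.count("*").alias("order_count"))
)

# Write a columnar output for downstream analytics.
daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue/")

spark.stop()
```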

Posted 3 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office

About The Role
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Snowflake Data Warehouse, PySpark, Core Banking
Good to have skills : AWS BigData
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data processing workflow. Your role will be pivotal in enhancing the efficiency and reliability of data operations within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processing workflows to optimize performance.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse, Core Banking, PySpark.
- Good To Have Skills: Experience with AWS BigData.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education
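To ground the Snowflake-plus-ETL combination this role asks for, the hedged sketch below uses the snowflake-connector-python package to merge a staged batch into a curated table, one small piece of the kind of ETL process described; the account, credentials, and table names are placeholders, not taken from the posting.

```python
import snowflake.connector

# Placeholder credentials; a real pipeline would pull these from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="ETL_SERVICE",
    password="***",
    warehouse="ETL_WH",
    database="BANKING",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Typical ETL step: merge freshly staged transactions into the curated table.
    cur.execute("""
        MERGE INTO CURATED.TRANSACTIONS AS tgt
        USING STAGING.TRANSACTIONS_TODAY AS src
        ON tgt.TXN_ID = src.TXN_ID
        WHEN MATCHED THEN UPDATE SET tgt.AMOUNT = src.AMOUNT, tgt.STATUS = src.STATUS
        WHEN NOT MATCHED THEN INSERT (TXN_ID, AMOUNT, STATUS)
            VALUES (src.TXN_ID, src.AMOUNT, src.STATUS)
    """)
    print(f"rows affected: {cur.rowcount}")
finally:
    conn.close()
```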

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

About The Role
Project Role : Data Governance Practitioner
Project Role Description : Establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Your typical day will involve collaborating with key stakeholders to define data standards, facilitating effective data collection, storage, access, and usage, and driving data stewardship initiatives for comprehensive and effective data governance. You will engage in discussions that shape the data landscape of the organization, ensuring that data practices align with established policies and standards.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the development and implementation of data governance frameworks and policies.
- Monitor compliance with data governance policies and report on data quality metrics.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data governance principles and best practices.
- Experience with data quality assessment and improvement techniques.
- Familiarity with data management tools and technologies.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
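Reporting on data quality metrics is one of the concrete duties listed. Purely as an illustration, with invented records and rules rather than anything from the posting, the snippet below computes simple completeness and validity percentages of the kind such reports typically track.

```python
# Invented sample records standing in for rows pulled from a governed table.
records = [
    {"customer_id": "C001", "email": "a@example.com", "country": "IN"},
    {"customer_id": "C002", "email": None, "country": "IN"},
    {"customer_id": "C003", "email": "c@example", "country": None},
]

def completeness(rows, field):
    """Share of rows where the field is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field))
    return filled / len(rows)

def validity(rows, field, predicate):
    """Share of non-empty values that pass a simple validity rule."""
    values = [r[field] for r in rows if r.get(field)]
    return sum(1 for v in values if predicate(v)) / len(values) if values else 0.0

# Example governance metrics: email completeness and a naive email validity rule.
print(f"email completeness: {completeness(records, 'email'):.0%}")   # 67%
print(f"email validity: {validity(records, 'email', lambda v: '@' in v and '.' in v.split('@')[-1]):.0%}")  # 50%
```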

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies