7.0 - 10.0 years
7 - 9 Lacs
Chennai, Tamil Nadu, India
On-site
What you will be doing:
- Build the framework and features for the business as part of the Data Services team.
- Design, develop, and test Big Data ecosystem components and solutions (Hive, MapReduce, Sqoop, Spark, HDFS, HBase, Kafka, Impala, and Flume for data storage and analysis, or other NoSQL technologies); a minimal batch sketch follows this listing.
- Provide technical support and technical quality control for all projects across all phases.
- Apply program implementation skills, including project portfolio management and the planning, reporting, and direction of others' work across multiple concurrent implementation projects.
- Understand the data structure: how the data is stored, which technologies store it, and how to fetch data spread across multiple tables; design reports using AG Grid.

Day-to-day operations: ensure all required data sets are received and processed, deliver ad hoc data requirements, and analyse the existing reports listed in the UI.

What you bring:
- 7-10 years of experience in data warehouse or reporting solutions, with skills in Python, Spark, Hive, Impala, Hadoop architecture, and SQL queries.
- Good communication and presentation skills.
- Strong understanding of data flow, processing, and transformation.
- Knowledge of Linux is good to have.
- Banking domain experience is good to have.
- Knowledge of in-memory computation is an added advantage.
- Experience in maintaining technical code standards and a version control system.
- Fluency in most IT business architecture areas, including business process design and governance.
- Proficiency in tuning jobs for better performance.
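As an illustration of the batch Hive/Spark work this role describes, here is a minimal PySpark sketch that reads a Hive table, aggregates it, and writes the result back for a reporting UI to query. The table and column names (sales.transactions, region, amount) are hypothetical placeholders, not details from the posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hive support lets Spark read tables registered in the Hive metastore.
    spark = (
        SparkSession.builder
        .appName("daily-sales-rollup")  # hypothetical job name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read a (hypothetical) Hive table and aggregate amounts per region.
    txns = spark.table("sales.transactions")
    rollup = (
        txns.groupBy("region")
            .agg(F.sum("amount").alias("total_amount"),
                 F.count("*").alias("txn_count"))
    )

    # Write back as a Hive table so downstream reports (e.g. an AG Grid UI)
    # can query the pre-aggregated result instead of the raw data.
    rollup.write.mode("overwrite").saveAsTable("sales.region_daily_rollup")

    spark.stop()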
Posted 1 week ago
10.0 - 12.0 years
10 - 12 Lacs
Cochin, Kerala, India
On-site
EY GDS Data and Analytics (D&A) - Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We're looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
- Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years).
- Understand current and future state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the streaming sketch after this listing).

Skills and attributes for success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks.
- Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience in enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security (in motion and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent written and verbal communication skills, formal and informal.
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment; adaptability to new technologies and standards.
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- Minimum 7 years of hands-on experience in one or more of the above areas; minimum 10 years of industry experience.

Ideally, you'll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What we look for
People with technical experience and enthusiasm to learn new things in this fast-moving environment.
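To make the real-time ingestion responsibility concrete, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands it as Parquet. The broker address, topic name, and paths are hypothetical placeholders, and this shows one common pattern rather than the firm's prescribed implementation.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Requires the spark-sql-kafka connector package on the Spark classpath.
    spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

    # Subscribe to a (hypothetical) Kafka topic; Spark tracks offsets per
    # partition, so one query consumes from multiple partitions in parallel.
    raw = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
             .option("subscribe", "events")                     # hypothetical topic
             .option("startingOffsets", "latest")
             .load()
    )

    # Kafka delivers key/value as binary; decode the value for downstream use.
    events = raw.select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )

    # Land the stream as Parquet with a checkpoint so the job restarts safely.
    query = (
        events.writeStream
              .format("parquet")
              .option("path", "/data/landing/events")            # hypothetical path
              .option("checkpointLocation", "/data/chk/events")  # hypothetical path
              .trigger(processingTime="1 minute")
              .start()
    )
    query.awaitTermination()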
Posted 2 weeks ago
2.0 - 5.0 years
3 - 7 Lacs
Ahmedabad
Work from Office
Roles and responsibility:
- Develop expertise in the different upstream data stores and systems.
- Design, develop, and maintain data integration pipelines for growing data sets and product offerings.
- Build testing and QA plans for data pipelines.
- Skilled in ETL data engineering.
- Build data validation testing frameworks to ensure high data quality and integrity (a minimal validation sketch follows).
- Write and maintain documentation on data pipelines and schemas.
- Extensive experience with data integration tools to analyse root causes and fix production and development issues.
- Good understanding of Hive, Spark, and Hadoop architecture and optimization.
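As an illustration of the data validation work described above, here is a minimal PySpark sketch running a few row-level quality checks and failing loudly when any check trips. The table name, columns, and checks are hypothetical placeholders, not requirements from the posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("pipeline-validation").getOrCreate()

    # Load the pipeline output to validate (hypothetical table name).
    df = spark.table("staging.customer_orders")
    total = df.count()

    # Check 1: the primary-key column must never be null.
    null_keys = df.filter(F.col("order_id").isNull()).count()

    # Check 2: primary keys must be unique.
    dupes = total - df.select("order_id").distinct().count()

    # Check 3: amounts must be non-negative.
    bad_amounts = df.filter(F.col("amount") < 0).count()

    failures = {
        "null_order_id": null_keys,
        "duplicate_order_id": dupes,
        "negative_amount": bad_amounts,
    }

    # Raise so the orchestrator marks the pipeline run as broken.
    failed = {name: n for name, n in failures.items() if n > 0}
    if failed:
        raise ValueError(f"Data validation failed on {total} rows: {failed}")
    print(f"All checks passed on {total} rows.")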
Posted 1 month ago